This paper discusses and compares the methods of attitude scale construction of Thurstone (method of equal-appearing intervals), Likert (method of summated ratings), and Guttman (method of scale analysis), with special emphasis on the latter as one of the most recent and significant contributions to the field. Despite a certain lack of methodological precision, scale analysis provides a means of evaluating the uni-dimensionality of a set of items. If the criteria for uni-dimensionality are met, the interpretation of rank-order scores is made unambiguous, and efficiency of prediction from the set of items is maximized. The Guttman technique, however, provides no satisfactory means of selecting the original set of items for scale analysis. Preliminary studies indicated that both the Likert and the Thurstone methods tend to select scalable sets of items and that their functions in this respect are complementary. A method of combining the Likert and Thurstone methods in order to yield a highly scalable set of items is outlined. Sets of 14 items selected by the method have, in the two cases where the technique has been tried, yielded very satisfactory scalability.
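To make the scale-analysis criterion concrete, the following minimal sketch computes Guttman's coefficient of reproducibility for a small binary item matrix using a Goodenough–Edwards-style error count; the data, function name, and the 0.90 cut-off noted in the comment are generic assumptions for illustration, not material from the studies discussed here.

```python
import numpy as np

def coefficient_of_reproducibility(responses):
    """Goodenough-Edwards estimate of Guttman's coefficient of reproducibility.

    responses: 2-D array (respondents x items) of 0/1 endorsements.
    Items are ordered from easiest (most endorsed) to hardest, and each
    respondent's ideal pattern endorses the k easiest items, where k is
    their total score.  Errors are cells where the observed pattern departs
    from that ideal.  CR >= 0.90 is the conventional scalability criterion.
    """
    responses = np.asarray(responses)
    n_resp, n_items = responses.shape
    # Order items from most to least endorsed (easiest first).
    order = np.argsort(-responses.sum(axis=0))
    ordered = responses[:, order]
    scores = ordered.sum(axis=1)
    # Ideal Guttman pattern: endorse the `score` easiest items.
    ideal = (np.arange(n_items) < scores[:, None]).astype(int)
    errors = np.sum(ordered != ideal)
    return 1.0 - errors / (n_resp * n_items)

# Example: 4 respondents x 3 items, one deviation from a perfect scale.
data = [[1, 1, 1],
        [1, 1, 0],
        [1, 0, 1],   # deviates from the ideal cumulative pattern
        [0, 0, 0]]
print(coefficient_of_reproducibility(data))   # 0.833 for this toy matrix
```

A set of items is conventionally treated as acceptably scalable when the coefficient reaches roughly 0.90; evaluating that kind of criterion for a candidate item set is what the scale-analysis step described above provides.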
To evaluate the design of I-Corps@NCATS as a translational scientist training program, we mapped specific elements of the program’s content and pedagogy to the characteristics of a translational scientist, as first defined by Gilliland et al.: systems thinker, process innovator, boundary spanner, team player, and skilled communicator. Using a mixed-methods evaluation, we examined how the I-Corps@NCATS training program, delivered across twenty-two Clinical and Translational Science Award Hubs, impacted the development of these key translational scientist characteristics.
Methods:
We developed survey items to assess the characteristics of systems thinker, process innovator, boundary spanner, team player, and skilled communicator. Data were collected from a national sample of 281 participants in the I-Corps@NCATS program. Using post-then-retrospective-pre survey items, participants self-reported their ability to perform skills associated with each of the translational scientist characteristics. Additionally, two open-ended survey questions explored how the program shifted participants’ translational orientation, generating 211 comments. These comments were coded through a team-based, iterative process.
Results:
Respondents reported the greatest increases in self-assessed abilities related to systems thinking and skilled communication. Participants indicated the highest levels of abilities related to team player and boundary spanner. From the coding of open-ended comments, we identified two additional characteristics of translational scientists: intellectual humility and cognitive flexibility.
Conclusions:
Participation in I-Corps@NCATS accelerates translational science in two ways: 1) by teaching the process of scientific translation from research ideas to real-world solutions, and 2) by encouraging growth in the mindset and characteristics of a translational scientist.
Background: Efgartigimod, a human immunoglobulin G (IgG)1 antibody Fc fragment, blocks the neonatal Fc receptor, decreasing IgG recycling and reducing pathogenic IgG autoantibody levels. ADHERE assessed the efficacy and safety of efgartigimod PH20 subcutaneous (SC; co-formulated with recombinant human hyaluronidase PH20) in chronic inflammatory demyelinating polyneuropathy (CIDP). Methods: ADHERE enrolled participants with CIDP (treatment naive or on standard treatments withdrawn during the run-in period) and consisted of open-label Stage A (efgartigimod PH20 SC once weekly [QW]) and randomized (1:1) Stage B (efgartigimod or placebo QW). Primary outcomes were clinical improvement (assessed with aINCAT, I-RODS, or mean grip strength; Stage A) and time to first aINCAT score deterioration (relapse; Stage B). Secondary outcomes included the incidence of treatment-emergent adverse events (TEAEs). Results: 322 participants entered Stage A; 214 (66.5%) were considered responders, randomized, and treated in Stage B. Efgartigimod significantly reduced the risk of relapse (HR: 0.394; 95% CI: 0.25–0.61) versus placebo (p=0.000039). Reduced risk of relapse occurred in participants receiving corticosteroids, intravenous or SC immunoglobulin, or no treatment before study entry. Most TEAEs were mild to moderate; 3 deaths occurred, none related to efgartigimod. Conclusions: Participants treated with efgartigimod PH20 SC maintained a clinical response and remained relapse-free longer than those treated with placebo.
Behavioural treatments are recommended first-line for insomnia, but long-term benzodiazepine receptor agonist (BZRA) use remains common, and engaging patients in a deprescribing consultation is challenging. Few deprescribing interventions directly target patients. Prescribers’ support of patient-targeted interventions may facilitate their uptake. Recently assessed in the Your Answers When Needing Sleep in New Brunswick (YAWNS NB) study, Sleepwell (mysleepwell.ca) was developed as a direct-to-patient behaviour change intervention promoting BZRA deprescribing and non-pharmacological insomnia management. BZRA prescribers of YAWNS NB participants were invited to complete an online survey assessing the acceptability of Sleepwell as a direct-to-patient intervention. The survey was developed using the seven constructs of the theoretical framework of acceptability (TFA). Respondents (40/250, 17.2%) indicated high acceptability, with positive responses per TFA construct averaging 32.3/40 (80.7%). Perceived as an ethical, credible, and useful tool, Sleepwell also promoted prescriber–patient BZRA deprescribing engagements (11/19, 58%). Prescribers were accepting of Sleepwell and supported its application as a direct-to-patient intervention.
Rice in Mississippi is often in early seedling growth stages when paraquat-based herbicide treatments are commonly applied to corn, cotton, and soybean; therefore, off-target movement of the herbicide onto adjacent rice fields may occur. After an off-target movement event has occurred, weed management in the rice crop is still necessary. Field studies were conducted from 2019 to 2021 in Stoneville, MS, to evaluate rice injury and barnyardgrass control with labeled herbicides after exposure to a sublethal concentration of paraquat. Herbicide treatments were label-recommended rates of imazethapyr, quinclorac, propanil, bispyribac-sodium, cyhalofop, and florpyrauxifen-benzyl applied following rice exposure to a sublethal concentration of paraquat. Rice injury was detected 7 and 28 d after treatment (DAT) and was ≤35% and ≤14%, respectively, for all herbicides. Florpyrauxifen-benzyl and imazethapyr caused the greatest rice injury at 28 DAT. Following paraquat exposure, barnyardgrass control was similar for all labeled herbicide treatments at 7, 14, and 28 DAT except for florpyrauxifen-benzyl and no herbicide (paraquat alone) at 7 DAT. Across all evaluations, barnyardgrass control was at least 12% greater following paraquat exposure and labeled herbicide treatments than with no paraquat exposure. The current research demonstrates that labeled rates of herbicides applied following exposure to a sublethal concentration of paraquat resulted in <36% injury and provided as much as 95% control of barnyardgrass, depending on the herbicide treatment. Therefore, the choice of labeled herbicide following rice exposure to a sublethal concentration of paraquat should be based on weed spectrum.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Bilingualism is thought to confer advantages in executive functioning, thereby contributing to cognitive reserve and a later age of dementia symptom onset. While the relation between bilingualism and age of onset has been explored in Alzheimer's dementia, there are few studies examining bilingualism as a contributor to cognitive reserve in frontotemporal dementia (FTD). In line with previous findings, we hypothesized that bilinguals with behavioral variant FTD would be older at symptom onset compared to monolinguals, but that no such effect would be found in patients with nonfluent/agrammatic variant primary progressive aphasia (PPA) or semantic variant PPA. Contrary to our hypothesis, we found no significant difference in age at symptom onset between monolingual and bilingual speakers within any of the FTD variants, and there were no notable differences on neuropsychological measures. Overall, our results do not support a protective effect of bilingualism in patients with FTD-spectrum disease in a U.S.-based cohort.
This 17-year prospective study applied a social-development lens to the challenge of identifying long-term predictors of adult depressive symptoms. A diverse community sample of 171 individuals was repeatedly assessed from age 13 to age 30 using self-, parent-, and peer-report methods. As hypothesized, competence in establishing close friendships beginning in adolescence had a substantial long-term predictive relation to adult depressive symptoms at ages 27–30, even after accounting for prior depressive, anxiety, and externalizing symptoms. Intervening relationship difficulties at ages 23–26 were identified as part of pathways to depressive symptoms in the late twenties. Somewhat distinct paths by gender were also identified, but in all cases were consistent with an overall role of relationship difficulties in predicting long-term depressive symptoms. Implications both for early identification of risk as well as for potential preventive interventions are discussed.
This study examined struggles to establish autonomy and relatedness with peers in adolescence and early adulthood as predictors of advanced epigenetic aging assessed at age 30. Participants (N = 154; 67 male and 87 female) were observed repeatedly, along with close friends and romantic partners, from ages 13 through 29. Observed difficulty establishing close friendships characterized by mutual autonomy and relatedness from ages 13 to 18, an interview-assessed attachment state of mind lacking autonomy and valuing of attachment at 24, and self-reported difficulties in social integration across adolescence and adulthood were all linked to greater epigenetic age at 30, after accounting for chronological age, gender, race, and income. Analyses assessing the unique and combined effects of these factors, along with lifetime history of cigarette smoking, indicated that each of these factors, except for adult social integration, contributed uniquely to explaining epigenetic age acceleration. Results are interpreted as evidence that the adolescent preoccupation with peer relationships may be highly functional given the relevance of such relationships to long-term physical outcomes.
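For clarity on the outcome measure, epigenetic age acceleration is commonly operationalized as the residual from regressing epigenetic (DNA methylation) age on chronological age; the minimal sketch below illustrates that basic computation with invented values and is not the authors' covariate-adjusted model.

```python
import numpy as np

def epigenetic_age_acceleration(epigenetic_age, chronological_age):
    """Residual from a simple linear regression of epigenetic age on
    chronological age; positive values indicate an 'older' epigenetic
    profile than expected for that chronological age."""
    x = np.asarray(chronological_age, float)
    y = np.asarray(epigenetic_age, float)
    slope, intercept = np.polyfit(x, y, 1)   # least-squares fit
    return y - (slope * x + intercept)

# Invented values for five participants assessed around age 30.
chronological = [29.5, 30.1, 30.6, 29.9, 30.4]
epigenetic    = [33.5, 28.9, 31.2, 36.0, 29.5]
print(epigenetic_age_acceleration(epigenetic, chronological))
```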
Introduction. While many individuals quit smoking during pregnancy, most relapse within one year postpartum. Research into methods to decrease smoking relapse postpartum has been hampered by difficulties with recruitment. Method. We conducted individual interviews with pregnant women (N = 22) who were interested in quitting smoking while pregnant about their attitudes regarding smoking and quitting during pregnancy, clinical trial participation, and smoking cessation medication use. Results. Participants were aware of the risks of smoking while pregnant. Many wanted to quit smoking before delivery. Few used empirically supported treatments to quit. While research was viewed positively, interest in taking on new commitments postpartum and taking a medication to prevent relapse was low. Medication concerns were evident among most participants, especially among those planning to breastfeed. Further, several women noted medication was unnecessary, as they did not believe they would relapse postpartum. Financial incentives, childcare, and fewer and/or remote visits were identified as facilitators to participating in research. However, these factors did not outweigh women’s concerns about medication use and time commitments. Conclusions. Women are aware that quitting smoking during pregnancy and remaining smoke-free postpartum are important. However, beliefs that personal relapse risk is low and that medications are dangerous reduced enthusiasm for taking medication for postpartum relapse prevention. Future medication trials should educate women about the high likelihood of relapse, prepare to answer detailed questions about risks of cessation medications, and connect with participants’ clinicians. For new mothers, studies conducted remotely with few scheduled appointments would reduce barriers to participation.
Performance characteristics of SARS-CoV-2 nucleic acid detection assays are understudied within contexts of low pre-test probability, including screening asymptomatic persons without epidemiological links to confirmed cases, or asymptomatic surveillance testing. SARS-CoV-2 detection without symptoms may represent presymptomatic or asymptomatic infection, resolved infection with persistent RNA shedding, or a false-positive test. This study assessed the positive predictive value of SARS-CoV-2 real-time reverse transcription polymerase chain reaction (rRT-PCR) assays by retesting positive specimens from 5 pre-test probability groups ranging from high to low with an alternate assay.
Methods:
In total, 122 rRT-PCR positive specimens collected from unique patients between March and July 2020 were retested using a laboratory-developed nested RT-PCR assay targeting the RNA-dependent RNA polymerase (RdRp) gene followed by Sanger sequencing.
Results:
Significantly fewer positive results were reproduced by the nested RdRp gene RT-PCR assay in the lowest pre-test probability group (15.6%; facilities with institution-wide screening and ≤3 positive asymptomatic cases) than in each of the 4 groups with higher pre-test probability (individual group range, 50.0%–85.0%).
Conclusions:
Large-scale SARS-CoV-2 screening testing initiatives among low pre-test probability populations should be evaluated thoroughly prior to implementation given the risk of false-positive results and consequent potential for harm at the individual and population level.
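The pre-test probability effect reported above follows directly from Bayes' rule; the sketch below uses hypothetical sensitivity and specificity values (not estimates from this study) to show how the positive predictive value of even a highly specific assay falls as prevalence in the tested population drops.

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """P(true infection | positive test) via Bayes' rule."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# Hypothetical assay: 95% sensitivity, 99.95% specificity.
for prevalence in (0.10, 0.01, 0.0005):   # high to very low pre-test probability
    ppv = positive_predictive_value(0.95, 0.9995, prevalence)
    print(f"prevalence {prevalence:.2%}: PPV = {ppv:.2f}")
```

With these illustrative numbers the PPV drops from about 0.99 at 10% prevalence to roughly 0.49 at 0.05% prevalence, which is the arithmetic behind the caution about screening low pre-test probability populations.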
Substantial progress has been made in the standardization of nomenclature for paediatric and congenital cardiac care. In 1936, Maude Abbott published her Atlas of Congenital Cardiac Disease, which was the first formal attempt to classify congenital heart disease. The International Paediatric and Congenital Cardiac Code (IPCCC) is now utilized worldwide and has most recently become the paediatric and congenital cardiac component of the Eleventh Revision of the International Classification of Diseases (ICD-11). The most recent publication of the IPCCC was in 2017. This manuscript provides an updated 2021 version of the IPCCC.
The International Society for Nomenclature of Paediatric and Congenital Heart Disease (ISNPCHD), in collaboration with the World Health Organization (WHO), developed the paediatric and congenital cardiac nomenclature that is now within the eleventh version of the International Classification of Diseases (ICD-11). This unification of the IPCCC and ICD-11, the IPCCC ICD-11 Nomenclature, marks the first time that the clinical and administrative nomenclatures for paediatric and congenital cardiac care have been harmonized. The congenital cardiac component of ICD-11 grew from 29 codes in ICD-9 and 73 codes in ICD-10 to 318 codes submitted by ISNPCHD through 2018 for incorporation into ICD-11. After these 318 terms were incorporated into ICD-11 in 2018, the WHO ICD-11 team added a further 49 terms, some of which are acceptable legacy terms from ICD-10, while others provide greater granularity than the ISNPCHD originally thought acceptable. Thus, the total number of paediatric and congenital cardiac terms in ICD-11 is 367. In this manuscript, we describe and review the terminology, hierarchy, and definitions of the IPCCC ICD-11 Nomenclature. This article, therefore, presents a global system of nomenclature for paediatric and congenital cardiac care that unifies clinical and administrative nomenclature.
The members of ISNPCHD realize that the nomenclature published in this manuscript will continue to evolve. The version of the IPCCC that was published in 2017 has evolved and changed, and it is now replaced by this 2021 version. In the future, ISNPCHD will again publish updated versions of IPCCC, as IPCCC continues to evolve.
A multi-disciplinary expert group met to discuss vitamin D deficiency in the UK and strategies for improving population intakes and status. Changes to UK Government advice since the 1st Rank Forum on Vitamin D (2009) were discussed, including the rationale for setting a reference nutrient intake (10 µg/d; 400 IU/d) for adults and children (4+ years). Current UK data show inadequate intakes among all age groups and a high prevalence of low vitamin D status among specific groups (e.g. pregnant women and adolescent males/females). Evidence of widespread deficiency within some minority ethnic groups, resulting in nutritional rickets (particularly among Black and South Asian infants), raised particular concern. The latest data indicate that UK population vitamin D intakes and status remain relatively unchanged since Government recommendations changed in 2016. Vitamin D food fortification was discussed as a potential strategy to increase population intakes. Data from dose–response and dietary modelling studies indicate dairy products, bread, hens’ eggs and some meats as potential fortification vehicles. Vitamin D3 appears more effective than vitamin D2 for raising serum 25-hydroxyvitamin D concentration, which has implications for the choice of fortificant. Other considerations for successful fortification strategies include: (i) the need for ‘real-world’ cost information for use in modelling work; (ii) supportive food legislation; (iii) improved consumer and health professional understanding of vitamin D’s importance; (iv) the clinical consequences of inadequate vitamin D status and (v) consistent communication of Government advice across health/social care professions, and via the food industry. These areas urgently require further research to enable universal improvement in vitamin D intakes and status in the UK population.
During March 27–July 14, 2020, the Centers for Disease Control and Prevention’s National Healthcare Safety Network extended its surveillance to hospital capacity for responding to the COVID-19 pandemic. The data showed wide variations across hospitals in case burden, bed occupancies, ventilator usage, and healthcare personnel and supply status. These data were used to inform emergency responses.
This systematic review examined the phenomenon of trust during public health emergency events. The literature reviewed comprised field studies of people directly affected, or likely to be affected, by such events and included quantitative, qualitative, mixed-method, and case study primary studies in English (N = 38) as well as in Arabic, Chinese, French, Russian, and Spanish (all non-English, N = 30). Studies were mostly from high- and middle-income countries, and the event most often covered was infectious disease. Findings from individual studies were first synthesized within methods and evaluated for certainty/confidence, and then synthesized across methods. The final set of 11 findings synthesized across methods identified a set of activities for enhancing trust and showed that trust is a multi-faceted and dynamic concept.
Intensity in adolescent romantic relationships was examined as a long-term predictor of higher adult blood pressure in a community sample followed from age 17 to 31 years. Romantic intensity in adolescence – measured via the amount of time spent alone with a partner and the duration of the relationship – was predicted by parents’ psychologically controlling behavior and was in turn found to predict higher resting adult systolic and diastolic blood pressure even after accounting for relevant covariates. The prediction to adult blood pressure was partially mediated via conflict in nonromantic adult friendships and intensity in adult romantic relationships. Even after accounting for these mediators, however, a direct path from adolescent romantic intensity to higher adult blood pressure remained. Neither family income in adolescence nor trait measures of personality assessed in adulthood accounted for these findings. The results of this study are interpreted both as providing further support for the view that adolescent social relationship qualities have substantial long-term implications for adult health, as well as suggesting a potential physiological mechanism by which adolescent relationships may be linked to adult health outcomes.
The rapid spread of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) throughout key regions of the United States in early 2020 placed a premium on timely, national surveillance of hospital patient censuses. To meet that need, the Centers for Disease Control and Prevention’s National Healthcare Safety Network (NHSN), the nation’s largest hospital surveillance system, launched a module for collecting hospital coronavirus disease 2019 (COVID-19) data. We present time-series estimates of the critical hospital capacity indicators from April 1 to July 14, 2020.
Design:
From March 27 to July 14, 2020, the NHSN collected daily data on hospital bed occupancy, number of hospitalized patients with COVID-19, and the availability and/or use of mechanical ventilators. Time series were constructed using multiple imputation and survey weighting to allow near–real-time daily national and state estimates to be computed.
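As an illustration of the estimation approach described in this design (not the NHSN's actual code), the sketch below combines a survey-weighted total, in which each reporting hospital's count is scaled by a weight, with Rubin's rule for pooling the point estimate across multiply imputed datasets; the hospital counts, weights, and helper names are hypothetical.

```python
import numpy as np

def weighted_total(counts, weights):
    """Survey-weighted national total: each reporting hospital's daily count
    is inflated by its weight (e.g., inverse probability of reporting)."""
    counts, weights = np.asarray(counts, float), np.asarray(weights, float)
    return float(np.sum(counts * weights))

def pool_imputations(estimates):
    """Rubin's rule for the point estimate: average the weighted estimate
    over the m completed (imputed) datasets."""
    return float(np.mean(estimates))

# Hypothetical day: three completed datasets after multiple imputation,
# each giving COVID-19 inpatient counts for four reporting hospitals.
weights = [180.0, 95.5, 210.2, 60.8]
imputed_counts = [[12, 40, 7, 55], [12, 38, 9, 55], [12, 41, 7, 53]]
daily_estimate = pool_imputations(
    [weighted_total(counts, weights) for counts in imputed_counts])
print(f"Estimated national COVID-19 inpatients: {daily_estimate:,.0f}")
```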
Results:
During the pandemic’s April peak in the United States, among an estimated 431,000 total inpatients, 84,000 (19%) had COVID-19. Although the number of inpatients with COVID-19 decreased from April to July, the proportion of occupied inpatient beds increased steadily. COVID-19 hospitalizations increased from mid-June in the South and Southwest regions after stay-at-home restrictions were eased. The proportion of inpatients with COVID-19 on ventilators decreased from April to July.
Conclusions:
The NHSN hospital capacity estimates served as important, near–real-time indicators of the pandemic’s magnitude, spread, and impact, providing quantitative guidance for the public health response. Use of the estimates detected the rise of hospitalizations in specific geographic regions in June after they declined from a peak in April. Patient outcomes appeared to improve from early April to mid-July.
Optimism is associated with reduced cardiovascular disease risk; however, few prospective studies have considered optimism in relation to hypertension risk specifically. We investigated whether optimism was associated with a lower risk of developing hypertension in U.S. service members, who are more likely to develop high blood pressure early in life. We also evaluated race/ethnicity, sex and age as potential effect modifiers of these associations.
Methods
Participants were 103 486 hypertension-free U.S. Army active-duty soldiers (mean age 28.96 years, 61.76% White, 20.04% Black, 11.01% Hispanic, 4.09% Asian, and 3.10% others). We assessed optimism, sociodemographic characteristics, health conditions, health behaviours and depression status at baseline (2009–2010) via self-report and administrative records, and ascertained incident hypertension over follow-up (2010–2014) from electronic health records and health assessments. We used Cox proportional hazards regression models to estimate hazard ratios (HRs) and 95% confidence intervals (CIs), and adjusted models for a broad range of relevant covariates.
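For readers unfamiliar with the modelling step, the sketch below shows a Cox proportional hazards fit of the kind described, using the lifelines Python package; the tiny dataset, column names, and covariates are invented for illustration and are not the study data or its full covariate set.

```python
import pandas as pd
from lifelines import CoxPHFitter  # pip install lifelines

# Hypothetical analysis file: one row per soldier, with follow-up time in
# years, an incident-hypertension indicator, and baseline covariates.
df = pd.DataFrame({
    "followup_years": [3.5, 2.1, 4.0, 1.2, 3.8, 2.9, 4.0, 0.9],
    "hypertension":   [0,   1,   0,   1,   0,   0,   1,   1],
    "optimism_score": [4.2, 2.1, 3.8, 3.0, 4.5, 2.4, 1.9, 3.6],
    "age":            [27,  31,  25,  36,  29,  33,  40,  22],
})

# All columns other than the duration/event columns enter the model as
# covariates, so the optimism hazard ratio is adjusted for age here.
cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="hypertension")
cph.print_summary()  # exp(coef) column gives hazard ratios with 95% CIs
```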
Results
Over a mean follow-up of 3.51 years, 15 052 incident hypertension cases occurred. The highest v. lowest optimism levels were associated with a 22% reduced risk of developing hypertension, after adjusting for all covariates including baseline blood pressure (HR = 0.78; 95% CI = 0.74–0.83). The difference in hypertension risk between the highest v. lowest optimism was also maintained when we excluded soldiers with hypertension in the first two years of follow-up and, separately, when we excluded soldiers with prehypertension at baseline. A dose–response relationship was evident with higher optimism associated with a lower relative risk (p < 0.001). Higher optimism was consistently associated with a lower risk of developing hypertension across sex, age and most race/ethnicity categories.
Conclusions
In a diverse cohort of initially healthy male and female service members particularly vulnerable to developing hypertension, higher optimism levels were associated with reduced hypertension risk independently of sociodemographic and health factors, a particularly notable finding given the young and healthy population. Results suggest optimism is a health asset and a potential target for public health interventions.
This 17-year prospective study applied a social-developmental lens to the challenge of distinguishing predictors of adolescent-era substance use from predictors of longer term adult substance use problems. A diverse community sample of 168 individuals was repeatedly assessed from age 13 to age 30 using test, self-, parent-, and peer-report methods. As hypothesized, substance use within adolescence was linked to a range of likely transient social and developmental factors that are particularly salient during the adolescent era, including popularity with peers, peer substance use, parent–adolescent conflict, and broader patterns of deviant behavior. Substance abuse problems at ages 27–30 were best predicted, even after accounting for levels of substance use in adolescence, by adolescent-era markers of underlying deficits, including lack of social skills and poor self-concept. The factors that best predicted levels of adolescent-era substance use were not generally predictive of adult substance abuse problems in multivariate models (either with or without accounting for baseline levels of use). Results are interpreted as suggesting that recognizing the developmental nature of adolescent-era substance use may be crucial to distinguishing factors that predict socially driven and/or relatively transient use during adolescence from factors that predict long-term problems with substance abuse that extend well into adulthood.