It is important for the research produced by industrial-organizational (I-O) psychologists to be rigorous, relevant, and useful to organizations. However, I-O psychology research is often not used in practice. In this paper, we (both practitioners and academics) argue that engaged scholarship—a particular method of inclusive, collaborative research that incorporates multiple stakeholder perspectives throughout the research process—can help reduce this academic–practice gap and advance the impact of I-O psychology. To examine the current state of the field, we reviewed empirical evidence of the prevalence of collaborative research by counting the articles with nonacademic authors across 14 key I-O psychology journals from 2018 to 2023. We then build on these findings by describing how engaged scholarship can be integrated throughout the research process and conclude with a call to action for I-O psychologists to conduct more collaborative research. Overall, our goal is to facilitate a fruitful conversation about the value of collaborative research that incorporates multiple stakeholder perspectives, in hopes of reducing the academic–practice gap. We also aim to inspire action in the field to maintain and enhance the impact of I-O psychology on the future world of work.
Objective:
Investigate the relationship between chronic neurobehavioral symptoms and cognitive performance in military personnel with a history of blast-related mild TBI, and compare this group to well-matched combat-deployed controls.
Participants and Methods:
274 participants (mean age=34 years; mean education=14.75 years; 91.2% male) enrolled in the EVOLVE longitudinal study of combat-deployed military personnel were subdivided into those with a history of blast TBI (n=165) and controls without a history of blast exposure or TBI (n=109). As part of a larger study, we conducted a sub-analysis of 5-year follow-up data. We focused on group differences (Mann-Whitney U) and correlations between self-reported neurobehavioral symptoms on the Frontal Systems Behavior Scale (FrSBE) and cognitive performance on measures of attention, working memory, processing speed, and executive functioning, including D-KEFS Color-Word Interference (CWI), Trail Making A and B, and the Conners Continuous Performance Test (CPT).
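As a concrete illustration of this analysis pipeline, the sketch below runs a Mann-Whitney U group comparison and a symptom–performance correlation with SciPy. The data are simulated (only the group sizes come from the abstract), and the use of Pearson correlation is an assumption; the abstract does not state which correlation coefficient was used.

```python
# Minimal sketch of the group-difference and correlation analyses
# (simulated data; Pearson r is an assumed choice).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
blast_apathy = rng.normal(60, 10, 165)    # hypothetical FrSBE apathy T-scores, Blast TBI (n=165)
ctrl_apathy = rng.normal(52, 10, 109)     # hypothetical FrSBE apathy T-scores, Controls (n=109)
blast_trails_a = rng.normal(30, 8, 165)   # hypothetical Trails A completion times (s)

# Group difference on a self-report scale (Mann-Whitney U)
u, p_u = stats.mannwhitneyu(blast_apathy, ctrl_apathy, alternative="two-sided")

# Within-group symptom-performance correlation
r, p_r = stats.pearsonr(blast_apathy, blast_trails_a)

print(f"Mann-Whitney U={u:.0f} (p={p_u:.3f}); apathy-Trails A r={r:.2f} (p={p_r:.3f})")
```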
Results:
The Blast TBI group reported higher levels of neurobehavioral symptoms on the FrSBE (p<.001), including the domains of apathy (p<.001), disinhibition (p<.001), and executive dysfunction (p<.001), compared to Controls. On cognitive measures, group differences were observed on CWI Inhibition/Switching (p=.008), Trails B time (p=.010), and CPT commission errors (p=.014), such that the Blast TBI group performed worse than Controls. No significant group differences were observed for CPT omission errors or CPT hit rate (ps>.05). After adjustment for multiple comparisons, greater FrSBE apathy correlated with slower performance on Trails A for Blast TBI (r=0.22, p=.014) but not Controls. Apathy endorsement was not significantly related to CPT omission errors for either group (ps>.05). Higher endorsement of disinhibition symptoms was associated with worse performance on the CWI Inhibition (Blast TBI: r=−0.19, p=.036; Controls: r=−0.28, p=.012) and Inhibition/Switching (Blast TBI: r=−0.23, p=.010; Controls: r=−0.29, p=.010) conditions for both groups, whereas only the Blast TBI group showed significant relationships between disinhibition symptoms and Trails B−A time (r=0.20, p=.025) and CPT commission errors (r=0.18, p=.038). Higher endorsement of executive dysfunction correlated with poorer performance on Trails B−A for both groups (Blast TBI: r=0.24, p=.009; Controls: r=0.24, p=.030).
Conclusions:
Our findings reveal that at 5-year follow-up, military personnel with a history of blast-related mild TBI reported significantly greater neurobehavioral symptoms and performed worse on standardized measures of executive functioning relative to combat-deployed controls without TBI or blast exposure. Significant relationships between neurobehavioral symptoms and cognitive performance were present in both groups. However, these relationships were more pronounced in the Blast TBI group, including greater apathy associated with slower visual tracking and greater endorsement of disinhibition associated with poorer set-switching. Objective measures of response inhibition were related to disinhibition endorsement in both groups, though impulsive errors were more pronounced in the Blast TBI group. Our results suggest that chronic cognitive and neurobehavioral symptoms are present in military personnel with a history of blast TBI and are discrepant from those of a well-matched control group of combat-deployed military personnel. Future studies of this population should explore models that predict cognitive performance from neurobehavioral symptoms, as this could inform treatment approaches for those at greatest risk of cognitive change.
This study aimed to analyse the temporal and spatial trends in the burden of anxiety disorders and major depressive disorder related to bullying victimisation on global, regional and country scales.
Methods
Data were from the 2019 Global Burden of Disease (GBD) Study. We assessed the global disability-adjusted life years (DALYs, per 100 000 population) of anxiety disorders and major depressive disorder attributable to bullying victimisation by age, sex and geographical location. Percentage changes in age-standardised DALY rates were used to quantify temporal trends, and annual rate changes across 204 countries and territories were used to present spatial trends. Furthermore, we examined the relationship between the sociodemographic index (SDI) and the burden of anxiety disorders and major depressive disorder attributable to bullying victimisation, together with its spatial and temporal characteristics globally.
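The temporal-trend metric reduces to a simple percentage change between the 1990 and 2019 age-standardised rates; a minimal sketch with illustrative numbers (not GBD estimates) follows.

```python
# Percentage change in an age-standardised DALY rate, 1990 to 2019
# (illustrative values, not GBD data).
def percent_change(rate_1990: float, rate_2019: float) -> float:
    """Return the percentage change from the 1990 rate to the 2019 rate."""
    return (rate_2019 - rate_1990) / rate_1990 * 100.0

# e.g., a rate rising from 30.0 to 37.0 DALYs per 100 000 population
print(f"{percent_change(30.0, 37.0):.2f}%")  # 23.33%
```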
Results
From 1990 to 2019, the global DALY rates of anxiety disorders and major depressive disorder attributable to bullying victimisation increased by 23.31 and 26.60%, respectively (27.27 and 29.07% for females; 18.88 and 23.84% for males). Across the 21 GBD regions, the highest age-standardised rates of bullying victimisation-related DALYs were in North Africa and the Middle East for anxiety disorders and in High-income North America for major depressive disorder. From 1990 to 2019, the region with the largest percentage increase in DALY rates was High-income North America (54.66% for anxiety disorders and 105.88% for major depressive disorder), whereas the region with the slowest growth or largest percentage decline was East Asia (1.71% for anxiety disorders and −25.37% for major depressive disorder). In terms of SDI, this study found overall upward trends in bullying-related mental disorders across areas at all SDI levels, although some areas showed temporary downward trends during certain periods.
Conclusions
The number and rates of DALYs of anxiety disorders and major depressive disorder attributable to bullying victimisation increased from 1990 to 2019. Effective strategies to eliminate bullying victimisation in children and adolescents are needed to reduce the burden of anxiety disorders and major depressive disorder. Considering the large variations in the burden by SDI and geographic location, future protective actions should be developed based on the specific cultural contexts, development status and regional characteristics of each country.
Among 287 US hospitals reporting data between 2015 and 2018, annual pediatric surgical site infection (SSI) rates ranged from 0% for gallbladder to 10.4% for colon surgeries. Colon, spinal fusion, and small-bowel SSI rates did not decrease with greater surgical volumes in contrast to appendix and ventricular-shunt SSI rates.
The availability of large healthcare datasets offers the opportunity for researchers to navigate the traditional clinical and translational science research stages in a nonlinear manner. In particular, data scientists can harness the power of large healthcare datasets to bridge from preclinical discoveries (T0) directly to assessing population-level health impact (T4). A successful bridge from T0 to T4 does not bypass the other stages entirely; rather, effective team science makes a direct progression from T0 to T4 impactful by incorporating the perspectives of researchers from every stage of the clinical and translational science research spectrum. In this exemplar, we demonstrate how effective team science overcame challenges and, ultimately, ensured success when a diverse team of researchers worked together, using healthcare big data to test population-level substance use disorder (SUD) hypotheses generated from preclinical rodent studies. This project, called Advancing Substance use disorder Knowledge using Big Data (ASK Big Data), highlights the critical roles that data science expertise and effective team science play in quickly translating preclinical research into public health impact.
Developing alternatives to antibiotics is an urgent need in livestock production. Antimicrobial peptides (AMPs) are regarded as powerful antibiotic substitutes (ASs) because they have broad-spectrum antimicrobial activity and growth-promoting ability. Here, we aimed to comprehensively assess the effects of AMPs on the growth performance, diarrhea rate, intestinal morphology and immunity of healthy or challenged piglets, compared with an antibiotics group or a negative control group. We performed a set of meta-analyses of feeding trials from database inception to 27 May 2019. Among the 1379 identified studies, 20 were included in our meta-analyses (56 arms and 4067 piglets). The meta-analyses revealed that (1) compared with the negative control group, AMPs significantly improved the healthy piglets' average daily gain (ADG), average daily feed intake (ADFI), gain : feed ratio (G/F), levels of immunoglobulins IgM and IgG, and intestinal villus height : crypt depth ratio (V/C) (P < 0.05). Meanwhile, AMPs significantly increased the challenged piglets' ADG, ADFI, G/F and V/C of the jejunum and ileum, and notably decreased the diarrhea rate (P < 0.05); (2) compared with the antibiotics group, the effects of AMPs were slightly weaker than those of antibiotics in healthy piglets, but AMPs had effects similar to those of antibiotics in challenged piglets. At higher purity, the optimal dose of AMPs may be approximately 0.01%. Our findings indicate that AMPs can improve piglet growth performance, enhance immunity, benefit intestinal morphology and decrease the diarrhea rate. AMPs could be excellent ASs, especially under infection conditions.
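For readers unfamiliar with the pooling step behind such meta-analyses, below is a minimal DerSimonian-Laird random-effects sketch. The per-trial ADG mean differences and variances are invented, and the choice of estimator is an assumption; the abstract does not specify the authors' pooling model.

```python
# DerSimonian-Laird random-effects pooling sketch (hypothetical trials).
import numpy as np

effects = np.array([25.0, 40.0, 18.0, 33.0])   # invented ADG mean differences (g/day)
variances = np.array([64.0, 100.0, 49.0, 81.0])

w = 1.0 / variances                             # fixed-effect (inverse-variance) weights
fe_mean = np.sum(w * effects) / np.sum(w)
q = np.sum(w * (effects - fe_mean) ** 2)        # Cochran's Q
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - (len(effects) - 1)) / c)   # between-trial variance

w_re = 1.0 / (variances + tau2)                 # random-effects weights
pooled = np.sum(w_re * effects) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled = {pooled:.1f} g/day, 95% CI [{pooled - 1.96*se:.1f}, {pooled + 1.96*se:.1f}]")
```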
Implementation of genome-scale sequencing in clinical care faces significant challenges: the technology is highly dimensional with many kinds of potential results, results interpretation and delivery require expertise and coordination across multiple medical specialties, clinical utility may be uncertain, and there may be broader familial or societal implications beyond the individual participant. Transdisciplinary consortia and collaborative team science are well poised to address these challenges. However, the complex web of organizational, institutional, physical, environmental, technologic, and other political and societal factors that influence the effectiveness of consortia remains understudied. We describe our experience working in the Clinical Sequencing Evidence-Generating Research (CSER) consortium, a multi-institutional translational genomics consortium.
Methods:
A key aspect of the CSER consortium was the juxtaposition of site-specific measures with the need to identify consensus measures related to clinical utility and to create a core set of harmonized measures. During this harmonization process, we sought to minimize participant burden, accommodate project-specific choices, and use validated measures that allow data sharing.
Results:
Identifying platforms to ensure swift communication between teams and to manage materials and data was essential to our harmonization efforts. Funding agencies can help consortia by clarifying key study design elements across projects during the proposal preparation phase and by providing a framework for sharing data across participating projects.
Conclusions:
In summary, time and resources must be devoted to developing and implementing collaborative practices as preparatory work at the beginning of project timelines to improve the effectiveness of research consortia.
Mindfulness-based art therapy (MBAT) induces emotional relaxation in cancer patients and is a treatment known to improve psychological stability. The objective of this research was to evaluate the treatment effects of MBAT for breast cancer patients.
Methods
Twenty-four breast cancer patients were selected as subjects of the study. Two groups, an MBAT group and a control group of 12 patients each, were randomly assigned. The patients in the MBAT group were given 12 sessions of treatment. To measure depression and anxiety, the relevant subscales of the Personality Assessment Inventory (PAI) were used. Health-related quality of life was evaluated using the European Organization for Research and Treatment of Cancer Quality of Life Questionnaire (EORTC-QLQ-C30). The treatment results were analyzed using ANCOVA and two-way repeated-measures ANOVA.
Results
The results showed that depression and anxiety decreased significantly and health-related quality of life improved significantly in the MBAT group. In the control group, however, there was no significant change.
Conclusions
MBAT can be seen as an effective treatment that improves breast cancer patients' psychological stability and quality of life. Further program development and large-scale research are needed to evaluate treatment effects for future clinical application.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
The search for life in the Universe is a fundamental problem of astrobiology and modern science. The current progress in the detection of terrestrial-type exoplanets has opened a new avenue in the characterization of exoplanetary atmospheres and in the search for biosignatures of life with the upcoming ground-based and space missions. To specify the conditions favourable for the origin, development and sustainment of life as we know it in other worlds, we need to understand the nature of the global (astrospheric) and local (atmospheric and surface) environments of exoplanets in the habitable zones (HZs) around G-K-M dwarf stars, including our young Sun. The global environment is formed by disturbances propagating from the planet-hosting stars in the form of stellar flares, coronal mass ejections, energetic particles and winds, collectively known as astrospheric space weather. Its characterization will help in understanding how an exoplanetary ecosystem interacts with its host star, and in specifying the physical, chemical and biochemical conditions that can create favourable and/or detrimental conditions for planetary climate and habitability, along with the evolution of planetary internal dynamics over geological timescales. The key linkage of (astro)physical, chemical and geological processes can only be understood in the framework of interdisciplinary studies incorporating progress in heliophysics, astrophysics, planetary and Earth sciences. The assessment of the impacts of host stars on the climate and habitability of terrestrial (exo)planets will significantly expand the current definition of the HZ to the biogenic zone and provide new observational strategies for searching for signatures of life. The major goal of this paper is to describe and discuss the current status and recent progress in this interdisciplinary field in light of presentations and discussions during the NASA Nexus for Exoplanetary System Science funded workshop ‘Exoplanetary Space Weather, Climate and Habitability’ and to provide a new roadmap for the future development of the emerging field of exoplanetary science and astrobiology.
Solar coronal dimmings have been observed extensively in the past two decades and are believed to be closely associated with coronal mass ejections (CMEs). A recent study found that coronal dimming is the only signature that can differentiate powerful flares that have CMEs from those that do not. Therefore, dimming might be one of the best candidates for observing stellar CMEs on distant Sun-like stars. In this study, we investigate the possibility of using coronal dimming as a proxy to diagnose stellar CMEs. By simulating a realistic solar CME event and the corresponding coronal dimming using a global magnetohydrodynamics model (AWSoM: Alfvén-wave Solar Model), we first demonstrate the capability of the model to reproduce solar observations. We then extend the model to simulate stellar CMEs by modifying the input magnetic flux density as well as the initial magnetic energy of the CME flux rope. Our results suggest that, with improved instrument sensitivity, it is possible to detect coronal dimming signals induced by stellar CMEs.
Rabies is one of the major public health problems in China, and its mortality rate remains the highest among all notifiable infectious diseases. A meta-analysis was conducted to investigate the post-exposure prophylaxis (PEP) vaccination rate and risk factors for human rabies in mainland China. The PubMed, Web of Science, Chinese National Knowledge Infrastructure, Chinese Science and Technology Periodical and Wanfang databases were searched for articles on rabies vaccination status published between 2007 and 2017. In total, 10 174 human rabies cases from 136 studies were included in this meta-analysis. Approximately 97.2% (95% confidence interval (CI) 95.1–98.7%) of rabies cases occurred in rural areas and 72.6% (95% CI 70.0–75.1%) occurred in farmers. Overall, the vaccination rate in the reported human rabies cases was 15.4% (95% CI 13.7–17.4%). However, among vaccinated individuals, 85.5% (95% CI 79.8–83.4%) did not complete the vaccination regimen. In a subgroup analysis, the PEP vaccination rate in the eastern region (18.8%, 95% CI 15.9–22.1%) was higher than that in the western region (13.3%, 95% CI 11.1–15.8%), and this rate decreased after 2007. Approximately 68.9% (95% CI 63.6–73.8%) of rabies cases involved category-III exposures, but the PEP vaccination rate among these cases was 27.0% (95% CI 14.4–44.9%) and only 6.1% (95% CI 4.4–8.4%) received rabies immunoglobulin. Together, these results suggest that the PEP vaccination rate among human rabies cases was low in mainland China. Therefore, standardised treatment and vaccination programmes for dog bites need to be further strengthened, particularly in rural areas.
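Pooled rates such as those above are typically obtained by combining study-level proportions, often on the logit scale. The fixed-effect sketch below uses invented study counts and is not the authors' method or data.

```python
# Fixed-effect pooling of study-level proportions on the logit scale
# (invented counts; illustration only).
import numpy as np

vaccinated = np.array([30, 12, 55, 21])      # hypothetical vaccinated cases per study
total = np.array([180, 90, 350, 140])        # hypothetical rabies cases per study

p = vaccinated / total
logit = np.log(p / (1 - p))
var = 1.0 / vaccinated + 1.0 / (total - vaccinated)  # variance of a logit proportion

w = 1.0 / var
pooled_logit = np.sum(w * logit) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))

inv = lambda x: 1.0 / (1.0 + np.exp(-x))     # back-transform to a proportion
lo, hi = pooled_logit - 1.96 * se, pooled_logit + 1.96 * se
print(f"pooled rate {inv(pooled_logit):.1%} (95% CI {inv(lo):.1%}-{inv(hi):.1%})")
```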
PALWEED:WHEAT is a bioeconomic decision model for determining profit-maximizing postemergence herbicide treatments for winter wheat in the Washington–Idaho Palouse region. PALWEED:WHEAT performed relatively well economically in 2 yr of on-farm field tests. However, the model was less sensitive than desired in prescribing postemergence broadleaf herbicides in the presence of high densities of broadleaf weed seedlings. Therefore, PALWEED:WHEAT was revised in response to the field testing. This paper compares the revised model's agronomic and economic performance with that of the original model in computer simulations. The revised model, PALWEED:WHEAT II, differs from the original in several respects: (1) exponential functions replace linear functions in predicting weed survival, (2) preplant application of a nonselective herbicide is entered as an exogenous binary variable, (3) separate indices of broadleaf and grass competition are substituted for an aggregate weed competition index in the wheat yield function, (4) a hyperbolic functional form replaces the logistic representation of weed damage to wheat yield, and (5) separate models are estimated for winter wheat after spring dry pea and for winter wheat in all examined crop rotation positions. In simulations covering a variety of agronomic and economic conditions, PALWEED:WHEAT II recommended postemergence herbicide types and rates that consistently complied with agronomic and economic theory. Furthermore, the revised model, especially when estimated from the relevant wheat-after-pea data set, was markedly more balanced in recommending both broadleaf and grass herbicides in response to observed densities of both weed groups. The substantial change in herbicide recommendations following field testing, in response to changes in the model's functional specifications, confirms the importance of field testing and revision of bioeconomic decision models.
Based on six years of data from a field experiment near Pullman, WA, a bioeconomic decision model was developed to annually estimate the optimal postemergence herbicide types and rates to control multiple weed species in winter wheat under various tillage systems and crop rotations. The model name, PALWEED:WHEAT, signifies a Washington–Idaho Palouse region weed management model for winter wheat. The model consists of linear preharvest weed density functions, a nonlinear yield response function, and a profit function. Preharvest weed density functions were estimated for four weed groups: summer annual grasses, winter annual grasses, summer annual broadleaves, and winter annual broadleaves. A single aggregated weed competition index was developed from the four density functions for use in the yield model. A yield model containing a logistic damage function performed better than a model containing a rectangular hyperbolic damage function. Herbicides were grouped into three categories: preplant nonselective, postemergence broadleaf, and postemergence grass. PALWEED:WHEAT was applied to average conditions of the 6-yr experiment to predict herbicide treatments that maximized profit. In comparison to average treatment rates in the 6-yr experiment, the bioeconomic decision model recommended less postemergence herbicide. The weed management recommendations of PALWEED:WHEAT behaved as expected by agronomic and economic theory in response to changes in assumed weed populations, herbicide costs, crop prices, and possible restrictions on herbicide application rates.
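The two candidate damage functions compared in these abstracts can be written compactly, as in the sketch below: a rectangular hyperbola (Cousens-type) and a logistic form for proportional yield loss as a function of weed density. The parameter values are illustrative assumptions, not the fitted PALWEED:WHEAT coefficients.

```python
# Generic weed-damage functions: proportional yield loss vs. weed density d
# (illustrative parameters, not PALWEED:WHEAT estimates).
import numpy as np

def hyperbolic_loss(d, i=0.02, a=0.60):
    """Rectangular hyperbola: loss rises linearly at low density (slope i)
    and saturates at maximum loss a."""
    return i * d / (1.0 + i * d / a)

def logistic_loss(d, a=0.60, b=0.15, m=30.0):
    """Logistic form: S-shaped loss, inflecting near density m and
    approaching maximum loss a."""
    return a / (1.0 + np.exp(-b * (d - m)))

densities = np.array([0.0, 10.0, 50.0, 200.0])  # weeds per unit area
print(hyperbolic_loss(densities))
print(logistic_loss(densities))
```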
Climate change has greatly affected agricultural production and will lead to further changes in cropping systems, varietal types and cultivation techniques in each region. The potential effects of climate change on rice production in Fujian Province, China, were explored in the current study with the CERES-Rice model and climate-change scenarios, based on the self-adaptation of rice production. The results indicated that simulated yields of early rice in the double-rice region in south-eastern Fujian under scenarios A2, B2 and A1B increased by 15·9, 18·0 and 19·2%, respectively, and those of late rice correspondingly increased by 9·2, 7·4 and 7·4%, when self-adaptation adjustment was considered, compared to scenarios without that consideration. In the double-rice region in north-western Fujian, simulated yields of early rice increased by 21·2, 20·5 and 18·9% and those of late rice by 14·7, 14·8 and 7·2% under scenarios A2, B2 and A1B, respectively, when self-adaptation was considered. Similar results were obtained for the single-rice region in the mountain areas of north-western Fujian, where yields correspondingly increased by 4·9, 5·0 and 2·9% when self-adaptation was considered compared to when it was not. In this single-rice region, double rice might be grown in the future at the Changting site under scenarios A1 and B2. When the self-adaptation adjustment was considered, the simulated overall output of rice crops in Fujian under scenarios A2, B2 and A1B increased by 5·9, 5·2 and 5·1%, respectively. Thus, more optimistic results were obtained when the self-adaptation ability of rice production was considered.
Clomazone has been successfully used for weed control in rice, but crop injury is a potential problem on light-textured soils. Experiments were conducted to determine the effect of soil characteristics and water potential on plant-available clomazone and rice injury. A centrifugal double-tube technique was used to determine plant-available concentration in soil solution (ACSS), total amount available in soil solution (TASS), and Kd values for clomazone on four soils at four water potentials. A rice bioassay was conducted in parallel with the plant-available study to correlate biological availability with ACSS, TASS, and Kd. TASS was significantly different in all soils. The order of increasing TASS for the soils studied was Morey < Edna < Nada < Crowley, which correlated well with soil characteristics. The order of increasing TASS after equilibrium was −90 < −75 < −33 < 0 kPa. TASS values at 0 kPa were more than twice the TASS values at −90 kPa. It appears that severe rice injury from clomazone on these soils could occur if TASS > 110 ng g⁻¹ and Kd < 1.1 ml g⁻¹. We propose that the double-tube technique provides a more accurate estimate of available herbicide because the solution : soil ratios are < 0.33:1 and are more representative of a plant root–herbicide relationship. This technique, or some variation of it, could be developed further so that clomazone rates can be more clearly defined, particularly on lighter-textured soils. TASS may be a better predictor of plant-available herbicide than ACSS when evaluating moderately to highly water-soluble herbicides in a nonsaturated soil environment.
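For readers unfamiliar with the partitioning quantities, Kd is the ratio of sorbed herbicide (per g of soil) to its concentration in soil solution. A minimal sketch with hypothetical numbers, not the paper's measured values:

```python
# Soil-solution partitioning coefficient Kd (ml/g), hypothetical values.
def kd(total_ng_per_g: float, solution_ng_per_ml: float,
       solution_ml_per_g: float) -> float:
    """Kd = sorbed herbicide (ng/g soil) / solution concentration (ng/ml)."""
    sorbed = total_ng_per_g - solution_ng_per_ml * solution_ml_per_g
    return sorbed / solution_ng_per_ml

# e.g., 150 ng/g applied, 40 ng/ml in solution, 0.3 ml solution per g soil
print(f"Kd = {kd(150.0, 40.0, 0.3):.2f} ml/g")  # ~3.45 ml/g
```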
Rural-to-urban migrant workers are a large marginalised population in urban China. Prevalence estimates of common mental health problems (CMHPs) in previous studies varied widely and very few studies have investigated migration-related factors of CMHPs in migrant workers. The objective of this study was to determine the prevalence and risk factors of CMHPs among Chinese migrant workers.
Methods.
A random sample of 3031 migrant workers of ten manufacturing factories in Shenzhen, China, completed a standardised questionnaire containing socio-demographic and migration-related variables and the Chinese 12-item General Health Questionnaire (GHQ-12). A GHQ-12 score of three or higher was used to denote the presence of CMHPs.
Results.
The prevalence of CMHPs was 34.4% in Chinese migrant workers. In multiple logistic regression, risk factors for CMHPs included being 16–25 years old (odds ratio [OR] 1.65, 95% confidence interval [CI] 1.28, 2.12), being 26–35 years old (OR 1.36, 95% CI 1.05, 1.75), low monthly income (OR 1.42, 95% CI 1.04, 1.92), poor living conditions (OR 1.76, 95% CI 1.22, 2.54), physical illness in the past 2 weeks (OR 1.72, 95% CI 1.43, 2.05), having worked in many cities (OR 1.34, 95% CI 1.03, 1.74), infrequently visiting hometown (OR 1.56, 95% CI 1.22, 1.99), poor Mandarin proficiency (OR 1.51, 95% CI 1.13, 2.01), a low level of perceived benefits of migration (OR 1.33, 95% CI 1.14, 1.55) and working more than 8 h/day (OR 1.39, 95% CI 1.14, 1.70).
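As an illustration of how such odds ratios and confidence intervals are derived, the sketch below fits a logistic regression with statsmodels on simulated data; the predictor names and coefficients are hypothetical, not the study's variables or estimates.

```python
# Logistic regression ORs with 95% CIs (simulated data, hypothetical predictors).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 3031
X = rng.integers(0, 2, size=(n, 2)).astype(float)      # e.g., long_hours, low_income
logit_p = -1.0 + 0.33 * X[:, 0] + 0.35 * X[:, 1]       # assumed true effects
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(float)  # CMHP case (GHQ-12 >= 3)

res = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
ors = np.exp(res.params)                               # odds ratios
ci = np.exp(res.conf_int())                            # 95% CIs on the OR scale
for name, o, (lo, hi) in zip(["const", "long_hours", "low_income"], ors, ci):
    print(f"{name}: OR {o:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```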
Conclusions.
CMHPs are very prevalent among Chinese migrant workers. Given the large number of Chinese migrant workers, there is an urgent need to address the mental health burden of China's migrant worker population.
We determined the prevalence and seasonality of Fasciola infections in goats and bovines (cattle and water buffalo) in Hubei and Anhui provinces of China. Faecal samples were collected at 2- to 3-month intervals from 200 goats in Hubei province and from 152 bovines in Anhui province. All faecal samples were examined for the presence of parasites. We determined the nucleotide sequences of the first and second internal transcribed spacers (ITS-1 and ITS-2) of the nuclear ribosomal DNA (rDNA) of 39 Fasciola worms from Anhui province. The prevalence of Fasciola infection in goats ranged between 3.5 and 37.0%, with mean eggs per gram (EPG) ranging between 29.0 and 166.0. Prevalence and EPG exhibited significant downward trends over time. The prevalence of Fasciola infection in cattle ranged between 13.3 and 46.2% (mean EPG, 36.4–100.0), and that in water buffalo ranged between 10.3 and 35.4% (mean EPG, 25.0–89.6), with higher prevalence of infection and EPG from June to October compared with December to March. Analysis of ITS-1 and ITS-2 sequences revealed that F. hepatica and F. gigantica were present in both bovine species in Anhui province and that F. gigantica mainly infected water buffalo. This is the first demonstration of Fasciola infection in Hubei province and of F. hepatica and F. gigantica in Anhui province. The present study in Hubei province shows that mass treatment of livestock with closantel sodium injections in April and August/September controlled Fasciola infection effectively.