Temporal variability and methodological differences in data normalization, among other factors, complicate effective trend analysis of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) wastewater surveillance data and its alignment with coronavirus disease 2019 (COVID-19) clinical outcomes. As there is no consensus approach for these analyses yet, this study explored the use of piecewise linear trend analysis (joinpoint regression) to identify significant trends and trend turning points in SARS-CoV-2 RNA wastewater concentrations (normalized and non-normalized) and corresponding COVID-19 case rates in the greater Las Vegas metropolitan area (Nevada, USA) from mid-2020 to April 2023. The analysis period was stratified into three distinct phases based on temporal changes in testing protocols, vaccination availability, SARS-CoV-2 variant prevalence, and public health interventions. While other statistical methodologies may require fewer parameter specifications, joinpoint regression provided an interpretable framework for characterization and comparison of trends and trend turning points, revealing sewershed-specific variations in trend magnitude and timing that also aligned with known variant-driven waves. Week-level trend agreement corroborated previous findings demonstrating a close relationship between SARS-CoV-2 wastewater surveillance data and COVID-19 outcomes. These findings guide future applications of advanced statistical methodologies and support the continued integration of wastewater-based epidemiology as a complementary approach to traditional COVID-19 surveillance systems.
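The piecewise linear (joinpoint) trend analysis described above can be sketched in a few lines: grid-search candidate breakpoints, fit a continuous two-segment model at each, and keep the breakpoint that minimizes squared error. The sketch below uses synthetic weekly data with an invented turning point at week 30; real analyses (e.g. the NCI Joinpoint software) also test breakpoint significance and allow multiple joinpoints.

```python
import numpy as np

def fit_joinpoint(x, y):
    """Grid-search a single joinpoint: fit the continuous two-segment
    model y = b0 + b1*x + b2*max(x - k, 0) at each candidate knot k
    and keep the knot with the lowest sum of squared errors."""
    best = None
    for k in x[2:-2]:                      # interior candidates only
        hinge = np.maximum(x - k, 0.0)
        X = np.column_stack([np.ones_like(x), x, hinge])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = float(np.sum((y - X @ coef) ** 2))
        if best is None or sse < best[0]:
            best = (sse, k, coef)
    return best[1], best[2]

# Synthetic weekly series with an upward trend that turns at week 30.
rng = np.random.default_rng(0)
weeks = np.arange(60, dtype=float)
signal = 0.5 * weeks - 1.2 * np.maximum(weeks - 30.0, 0.0)
y = signal + rng.normal(0.0, 0.5, weeks.size)

knot, coef = fit_joinpoint(weeks, y)
print(knot)   # estimated trend turning point, near week 30
```

The fitted slope before the knot is `coef[1]` and the slope change at the knot is `coef[2]`, which is how a joinpoint model expresses a trend turning point.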
Several hypotheses may explain the association between substance use, posttraumatic stress disorder (PTSD), and depression. However, few studies have utilized a large multisite dataset to understand this complex relationship. Our study assessed the relationship between alcohol and cannabis use trajectories and PTSD and depression symptoms across 3 months in recently trauma-exposed civilians.
Methods
In total, 1618 (1037 female) participants provided self-report data on past 30-day alcohol and cannabis use and PTSD and depression symptoms during their emergency department (baseline) visit. We reassessed participants' substance use and clinical symptoms 2, 8, and 12 weeks posttrauma. Latent class mixture modeling determined alcohol and cannabis use trajectories in the sample. Changes in PTSD and depression symptoms were assessed across alcohol and cannabis use trajectories via a mixed-model repeated-measures analysis of variance.
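The class-enumeration step of latent class mixture modeling can be illustrated loosely with a Gaussian mixture fitted to simulated use trajectories. This is a simplified stand-in: scikit-learn's `GaussianMixture`, the synthetic scores, and the three class shapes (low, high, increasing) are assumptions for demonstration, not the study's growth-mixture software.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Simulated past-30-day use scores at baseline and weeks 2, 8, 12 for
# three hypothetical trajectory classes: low, high, and increasing use.
rng = np.random.default_rng(1)
low = rng.normal([2, 2, 2, 2], 1.0, size=(200, 4))
high = rng.normal([20, 20, 19, 19], 2.0, size=(100, 4))
inc = rng.normal([3, 8, 14, 18], 2.0, size=(100, 4))
X = np.vstack([low, high, inc])

# Class enumeration: fit 1-5 classes and select the count minimizing
# BIC, mirroring the model-comparison step of latent class modeling.
bics = [GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
        for k in range(1, 6)]
best_k = int(np.argmin(bics)) + 1
print(best_k)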
Results
Three trajectory classes (low, high, increasing use) provided the best model fit for alcohol and cannabis use. The low alcohol use class exhibited lower PTSD symptoms at baseline than the high use class; the low cannabis use class exhibited lower PTSD and depression symptoms at baseline than the high and increasing use classes; these symptoms greatly increased at week 8 and declined at week 12. Participants who were already using alcohol and cannabis exhibited greater PTSD and depression symptoms at baseline, which increased at week 8 and decreased at week 12.
Conclusions
Our findings suggest that alcohol and cannabis use trajectories are associated with the intensity of posttrauma psychopathology. These findings could potentially inform the timing of therapeutic strategies.
Field studies were conducted to assess the efficacy of physical weed management of Palmer amaranth in cucumber, peanut, and sweetpotato. Treatments were arranged in a 3 × 4 factorial in which the first factor included a treatment method of electrical, mechanical, or hand-roguing Palmer amaranth control and the second factor consisted of treatments applied when Palmer amaranth was approximately 0.3, 0.6, 0.9, or 1.2 m above the crop canopy. Four wk after treatment (WAT), the electrical applications controlled Palmer amaranth at least 27 percentage points more than the mechanical applications when applied at the 0.3- and 0.6-m timings. At the 0.9- and 1.2-m application timings 4 WAT, electrical and mechanical applications controlled Palmer amaranth by at most 87%. Though hand removal generally resulted in the greatest peanut pod count and total sweetpotato yield, mechanical and electrical control resulted in similar yield to the hand-rogued plots, depending on the treatment timing. With additional research to provide insight into optimal application timings, there is potential for electrical control and mechanical control to be used as alternatives to hand removal. Additional studies were conducted to determine the effects of electrical treatments on Palmer amaranth seed production and viability. Treatments consisted of electricity applied to Palmer amaranth at first visible inflorescence, 2 wk after first visible inflorescence (WAI), or 4 WAI. Treatments at varying reproductive maturities did not reduce seed production immediately after treatment. However, after treatment, plants primarily died and ceased maturation, reducing seed production assessed at 4 WAI by 93% and 70% when treated at 0 and 2 WAI, respectively. Treatments did not have a negative effect on germination or seedling length.
Young people are most vulnerable to suicidal behaviours but least likely to seek help. A more elaborate study of the intrinsic and extrinsic correlates of suicidal ideation and behaviours particularly amid ongoing population-level stressors and the identification of less stigmatising markers in representative youth populations is essential.
Methods
Participants (n = 2540, aged 15–25) were consecutively recruited from an ongoing large-scale household-based epidemiological youth mental health study in Hong Kong between September 2019 and 2021. Lifetime and 12-month prevalence of suicidal ideation, plan, and attempt were assessed, alongside suicide-related rumination, hopelessness and neuroticism, personal and population-level stressors, family functioning, cognitive ability, lifetime non-suicidal self-harm, 12-month major depressive disorder (MDD), and alcohol use.
Results
The 12-month prevalence of suicidal ideation, ideation-only (no plan or attempt), plan, and attempt was 20.0, 15.4, 4.6, and 1.3%, respectively. Importantly, multivariable logistic regression findings revealed that suicide-related rumination was the only factor associated with all four suicidal outcomes (all p < 0.01). Among those with suicidal ideation (two-stage approach), intrinsic factors, including suicide-related rumination, poorer cognitive ability, and 12-month MDD, were specifically associated with suicide plan, while extrinsic factors, including coronavirus disease 2019 (COVID-19) stressors, poorer family functioning, and personal life stressors, as well as non-suicidal self-harm, were specifically associated with suicide attempt.
Conclusions
Suicide-related rumination, population-level COVID-19 stressors, and poorer family functioning may be important less-stigmatising markers for youth suicidal risks. The respective roles played by not only intrinsic but also extrinsic factors in suicide plan and attempt using a two-stage approach should be considered in future preventative intervention work.
METHODS/STUDY POPULATION: Cell culture & protein identification: human T cells were purified from healthy blood, then activated & cultured for 5 d. CAR-T cells were collected from infusion bags of cancer patients undergoing CAR-T therapy. Silver staining of naive & activated healthy T-cell lysates was compared; B-II spectrin was found to be downregulated in activated cells and this was confirmed by Western blot. Migration assays: naive & activated T-cells were imaged during migration on ICAM-1 and ICAM-1 + CXCL12 coated plates. T-cells were transfected with B-II spectrin cDNA & the chemokine dependence of migration was compared with controls. In-vivo studies: in a melanoma mouse model, B-II spectrin-transfected or control T-cells were injected; tumors were followed with serial imaging. Human patient records were examined to correlate endogenous B-II spectrin levels and CAR-T response. RESULTS/ANTICIPATED RESULTS: Activated T-cells downregulate the cytoskeletal protein B-II spectrin compared to naive cells, leading to chemokine-independent migration in in vitro assays and off-target trafficking when CAR-T cells are given in vivo. Restoration of B-II spectrin levels via transfection restores chemokine dependence of activated T-cells. In a mouse melanoma model, control mice injected with standard activated T-cells showed fewer cells in the tumor site and more cells in the off-target organs (spleen, lungs) when compared to mice injected with B-II spectrin-transfected cells. Furthermore, among 3 human patients undergoing CAR-T therapy, those with higher endogenous B-II spectrin levels experienced fewer side-effects, as measured by neurotoxicity and cytokine release syndrome grades. DISCUSSION/SIGNIFICANCE: A major hurdle to widespread CAR-T therapy for cancer is significant, often fatal side-effects. Our work shows that the protein B-II spectrin is downregulated during CAR-T production, and that restoring B-II spectrin levels decreases side-effects while increasing tumor clearance, hopefully translating to better CAR-T regimens in the future.
Racial and ethnic groups in the USA differ in the prevalence of posttraumatic stress disorder (PTSD). Recent research however has not observed consistent racial/ethnic differences in posttraumatic stress in the early aftermath of trauma, suggesting that such differences in chronic PTSD rates may be related to differences in recovery over time.
Methods
As part of the multisite, longitudinal AURORA study, we investigated racial/ethnic differences in PTSD and related outcomes within 3 months after trauma. Participants (n = 930) were recruited from emergency departments across the USA and provided periodic (2 weeks, 8 weeks, and 3 months after trauma) self-report assessments of PTSD, depression, dissociation, anxiety, and resilience. Linear models were fitted to investigate racial/ethnic differences in posttraumatic dysfunction, with subsequent follow-up models assessing potential effects of prior life stressors.
Results
Racial/ethnic groups did not differ in symptoms over time; however, Black participants showed reduced posttraumatic depression and anxiety symptoms overall compared to Hispanic participants and White participants. Racial/ethnic differences were not attenuated after accounting for differences in sociodemographic factors. However, racial/ethnic differences in depression and anxiety were no longer significant after accounting for greater prior trauma exposure and childhood emotional abuse in White participants.
Conclusions
The present findings suggest that differences in prior trauma exposure partially mediate the observed racial/ethnic differences in posttraumatic depression and anxiety symptoms following a recent trauma. Our findings further demonstrate that racial/ethnic groups show similar rates of symptom recovery over time. Future work utilizing longer time-scale data is needed to elucidate potential racial/ethnic differences in long-term symptom trajectories.
Brief measurements of the subjective experience of stress with good predictive capability are important in a range of community mental health and research settings. The potential for large-scale implementation of such a measure for screening may facilitate early risk detection and intervention opportunities. Few such measures however have been developed and validated in epidemiological and longitudinal community samples. We designed a new single-item measure of the subjective level of stress (SLS-1) and tested its validity and ability to predict long-term mental health outcomes of up to 12 months through two separate studies.
Methods
We first examined the content and face validity of the SLS-1 with a panel consisting of mental health experts and laypersons. Two studies were conducted to examine its validity and predictive utility. In study 1, we tested the convergent and divergent validity as well as incremental validity of the SLS-1 in a large epidemiological sample of young people in Hong Kong (n = 1445). In study 2, in a consecutively recruited longitudinal community sample of young people (n = 258), we first performed the same procedures as in study 1 to ensure replicability of the findings. We then examined in this longitudinal sample the utility of the SLS-1 in predicting long-term depressive, anxiety and stress outcomes assessed at 3 months and 6 months (n = 182) and at 12 months (n = 84).
Results
The SLS-1 demonstrated good content and face validity. Findings from the two studies showed that the SLS-1 was moderately to strongly correlated with a range of mental health outcomes, including depressive, anxiety, stress and distress symptoms. We also demonstrated its ability to explain variance in symptoms beyond other known personal and psychological factors. Using the longitudinal sample in study 2, we further showed the significant predictive capability of the SLS-1 for long-term symptom outcomes for up to 12 months, even when accounting for demographic characteristics.
Conclusions
The findings altogether support the validity and predictive utility of the SLS-1 as a brief measure of stress with strong indications of both concurrent and long-term mental health outcomes. Given the value of brief measures of mental health risks at a population level, the SLS-1 may have potential for use as an early screening tool to inform early preventative intervention work.
Bipolar disorder is associated with premature mortality, but evidence is mostly derived from Western countries. There has been no research evaluating shortened lifespan in bipolar disorder using life-years lost (LYLs), which is a recently developed mortality metric taking into account illness onset for life expectancy estimation. The current study aimed to examine the extent of premature mortality in bipolar disorder patients relative to the general population in Hong Kong (HK) in terms of standardised mortality ratio (SMR) and excess LYLs, and changes in mortality rate over time.
Methods
This population-based cohort study investigated excess mortality in 12 556 bipolar disorder patients between 2008 and 2018, by estimating all-cause and cause-specific SMRs, and LYLs. Trends in annual SMRs over the 11-year study period were assessed. Study data were retrieved from a territory-wide medical-record database of HK public healthcare services.
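The core SMR computation behind these estimates is simple to sketch: expected deaths are the cohort's person-years multiplied by age-specific general-population rates, and the SMR is observed over expected. The numbers below are invented for illustration; LYL estimation, which integrates survival from illness onset, is considerably more involved and is omitted here.

```python
import numpy as np

# Hypothetical age-stratified inputs (illustrative numbers only).
person_years = np.array([40_000.0, 35_000.0, 25_000.0])  # cohort follow-up
observed_deaths = np.array([30.0, 90.0, 210.0])          # cohort deaths
ref_rates = np.array([0.0003, 0.0012, 0.0040])           # population rates

# Expected deaths under general-population mortality, then the SMR:
# an SMR above 1 means the cohort dies at a higher rate than expected.
expected = person_years * ref_rates
smr = observed_deaths.sum() / expected.sum()
print(round(smr, 2))  # → 2.14
```

An all-cause SMR near 2.6, as reported below for this cohort, would mean roughly 2.6 observed deaths for every death expected under general-population rates.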
Results
Patients had higher all-cause [SMR: 2.60 (95% CI: 2.45–2.76)], natural-cause [SMR: 1.90 (95% CI: 1.76–2.05)] and unnatural-cause [SMR: 8.63 (95% CI: 7.34–10.03)] mortality rates than the general population. Respiratory diseases, cardiovascular diseases and cancers accounted for the majority of deaths. Men and women with bipolar disorder had 6.78 (95% CI: 6.00–7.84) years and 7.35 (95% CI: 6.75–8.06) years of excess LYLs, respectively. The overall mortality gap remained similar over time, albeit slightly improved in men with bipolar disorder.
Conclusions
Bipolar disorder is associated with increased premature mortality and substantially reduced lifespan in a predominantly Chinese population, with excess deaths mainly attributed to natural causes. The persistent mortality gap underscores an urgent need for targeted interventions to improve the physical health of patients with bipolar disorder.
Hand hygiene compliance decreased significantly when opportunities exceeded 30 per hour. At higher workloads, the number of healthcare worker types involved and the proportion of hand hygiene opportunities for which physicians and other healthcare workers were responsible increased. Thus, care complexity and risk to patients may both increase with workload.
To determine whether the order in which healthcare workers perform patient care tasks affects hand hygiene compliance.
Design:
For this retrospective analysis of data collected during the Strategies to Reduce Transmission of Antimicrobial Resistant Bacteria in Intensive Care Units (STAR*ICU) study, we linked consecutive tasks healthcare workers performed into care sequences and identified task transitions: 2 consecutive task sequences and the intervening hand hygiene opportunity. We compared hand hygiene compliance rates and used multiple logistic regression to determine the adjusted odds that healthcare workers (HCWs) transitioned in a direction that increased or decreased risk to patients if they did not perform hand hygiene before the task, and the adjusted odds that HCWs contaminated their hands.
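As a toy illustration of the unadjusted version of this comparison, an odds ratio can be computed from a 2×2 table of transition direction by hand hygiene compliance. The counts below are invented; the study's reported estimates come from multiple logistic regression with adjustment, not from a raw table.

```python
import numpy as np

# Hypothetical 2x2 table (invented counts, not STAR*ICU data):
# rows = transition direction, cols = hand hygiene (performed, missed).
table = np.array([[3_100.0, 6_900.0],   # dirtier -> cleaner
                  [1_800.0, 3_500.0]])  # cleaner -> dirtier

# Odds of compliance in each direction, then the odds ratio; a value
# below 1 means compliance is lower when moving from dirtier to cleaner.
odds = table[:, 0] / table[:, 1]
or_dirtier_to_cleaner = odds[0] / odds[1]
print(round(or_dirtier_to_cleaner, 2))  # → 0.87
```

Adjusted models add covariates (HCW type, glove use) to this same odds comparison, which is how the study arrives at figures such as the adjusted OR of 0.93 reported in the Results.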
Setting:
The study was conducted in 17 adult surgical, medical, and medical-surgical intensive care units.
Participants:
HCWs in the STAR*ICU study units.
Results:
HCWs moved from cleaner to dirtier tasks during 5,303 transitions (34.7%) and from dirtier to cleaner tasks during 10,000 transitions (65.4%). Physicians (odds ratio [OR], 1.50; P < .0001) and other HCWs (OR, 2.15; P < .0001) were more likely than nurses to move from dirtier to cleaner tasks. Glove use was associated with moving from dirtier to cleaner tasks (OR, 1.22; P < .0001). Hand hygiene compliance was lower when HCWs transitioned from dirtier to cleaner tasks than when they transitioned in the opposite direction (adjusted OR, 0.93; P < .0001).
Conclusions:
HCWs did not organize patient care tasks in a manner that decreased risk to patients, and they were less likely to perform hand hygiene when transitioning from dirtier to cleaner tasks than the reverse. These practices could increase the risk of transmission or infection.
Understanding the development of specific components of the neonatal immune system is critical to the understanding of the susceptibility of the neonate to specific pathogens [1]. With the increasing survival of extremely premature infants, neonatologists and other physicians caring for these newborns need to be aware of the vulnerability of this population. Furthermore, it is important for neonatologists to be able to differentiate between immune immaturity and the manifestations of a true primary immunodeficiency that present during the neonatal period. Failure to properly identify primary or acquired immunodeficiency diseases can result in delayed diagnosis and treatment, adversely affecting outcomes. This chapter will briefly describe the immune immaturity of the neonate and outline a diagnostic approach for primary immune deficiency diseases that may present in the neonatal period.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible by laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside of the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high-circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to full third-generation detectors at a fraction of the cost. Such sensitivity changes expected event rates for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and potentially allows for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
Abnormal effort-based decision-making represents a potential mechanism underlying motivational deficits (amotivation) in psychotic disorders. Previous research identified effort allocation impairment in chronic schizophrenia and focused mostly on physical effort modality. No study has investigated cognitive effort allocation in first-episode psychosis (FEP).
Method
Cognitive effort allocation was examined in 40 FEP patients and 44 demographically-matched healthy controls, using Cognitive Effort-Discounting (COGED) paradigm which quantified participants’ willingness to expend cognitive effort in terms of explicit, continuous discounting of monetary rewards based on parametrically-varied cognitive demands (levels N of N-back task). Relationship between reward-discounting and amotivation was investigated. Group differences in reward-magnitude and effort-cost sensitivity, and differential associations of these sensitivity indices with amotivation were explored.
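The discounting quantity at the heart of a COGED-style paradigm can be sketched as follows: each indifference point (the smaller low-effort amount judged equal to the full reward at a given N-back load) is converted to a subjective value, and steeper drops across loads indicate greater effort discounting. All numbers here are hypothetical; real administrations derive indifference points from an adaptive titration per participant.

```python
import numpy as np

# Hypothetical indifference points: the low-effort amount judged equal
# to a $2 reward offered at each N-back load (N = 2..5).
base_reward = 2.0
loads = np.array([2, 3, 4, 5])
indifference = np.array([1.8, 1.5, 1.1, 0.8])

# Subjective value of the effortful option at each load; the mean SV
# across loads summarizes discounting (lower = steeper discounting).
sv = indifference / base_reward
mean_sv = sv.mean()
print(round(mean_sv, 3))  # → 0.65
```

A group difference such as the one reported below (patients discounting more than controls) would appear as a lower mean subjective value in the patient group.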
Results
Patients displayed significantly greater reward-discounting than controls. In particular, such discounting was most pronounced in patients with high levels of amotivation even when N-back performance and reward base amount were taken into consideration. Moreover, patients exhibited reduced reward-benefit sensitivity and effort-cost sensitivity relative to controls, and that decreased sensitivity to reward-benefit but not effort-cost was correlated with diminished motivation. Reward-discounting and sensitivity indices were generally unrelated to other symptom dimensions, antipsychotic dose and cognitive deficits.
Conclusion
This study provides the first evidence of cognitive effort-based decision-making impairment in FEP, and indicates that decreased effort expenditure is associated with amotivation. Our findings further suggest that abnormal effort allocation and amotivation might primarily be related to blunted reward valuation. Prospective research is required to clarify the utility of effort-based measures in predicting amotivation and functional outcome in FEP.
Better understanding of interplay among symptoms, cognition and functioning in first-episode psychosis (FEP) is crucial to promoting functional recovery. Network analysis is a promising data-driven approach to elucidating complex interactions among psychopathological variables in psychosis, but has not been applied in FEP.
Method
This study employed network analysis to examine inter-relationships among a wide array of variables encompassing psychopathology, premorbid and onset characteristics, cognition, subjective quality-of-life and psychosocial functioning in 323 adult FEP patients in Hong Kong. Graphical Least Absolute Shrinkage and Selection Operator (LASSO) combined with extended Bayesian information criterion (BIC) model selection was used for network construction. Importance of individual nodes in a generated network was quantified by centrality analyses.
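A minimal sketch of the network-construction and centrality steps: estimate a sparse inverse covariance, convert it to partial correlations (the edge weights), and take node strength as the sum of absolute edge weights per node. This uses scikit-learn's cross-validated graphical lasso rather than the extended-BIC model selection used in the study, and the simulated six-variable data are purely illustrative.

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

# Simulated standardized scores on six hypothetical nodes (stand-ins
# for symptom, cognition and functioning measures); a shared latent
# factor induces the correlations that the network should recover.
rng = np.random.default_rng(2)
n, p = 300, 6
latent = rng.normal(size=(n, 1))
X = 0.8 * latent + rng.normal(size=(n, p))
X = (X - X.mean(axis=0)) / X.std(axis=0)

# Sparse inverse covariance via graphical lasso, converted to partial
# correlations, which are the edge weights of the network.
prec = GraphicalLassoCV().fit(X).precision_
d = np.sqrt(np.diag(prec))
partial = -prec / np.outer(d, d)
np.fill_diagonal(partial, 0.0)

# Node strength centrality: sum of absolute edge weights per node.
strength = np.abs(partial).sum(axis=0)
print(strength.round(2))
```

In the study's terms, the node with the largest `strength` value plays the most central role in the network, which is how amotivation is identified in the Results.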
Results
Our results showed that amotivation played the most central role and had the strongest associations with other variables in the network, as indexed by node strength. Amotivation and diminished expression displayed differential relationships with other nodes, supporting the validity of two-factor negative symptom structure. Psychosocial functioning was most strongly connected with amotivation and was weakly linked to several other variables. Within cognitive domain, digit span demonstrated the highest centrality and was connected with most of the other cognitive variables. Exploratory analysis revealed no significant gender differences in network structure and global strength.
Conclusion
Our results suggest the pivotal role of amotivation in psychopathology network of FEP and indicate its critical association with psychosocial functioning. Further research is required to verify the clinical significance of diminished motivation on functional outcome in the early course of psychotic illness.
The second Singapore Mental Health Study (SMHS) – a nationwide, cross-sectional, epidemiological survey – was initiated in 2016 with the intent of tracking the state of mental health of the general population in Singapore. The study employed the same methodology as the first survey initiated in 2010. The SMHS 2016 aimed to (i) establish the 12-month and lifetime prevalence and correlates of major depressive disorder (MDD), dysthymia, bipolar disorder, generalised anxiety disorder (GAD), obsessive compulsive disorder (OCD) and alcohol use disorder (AUD) (which included alcohol abuse and dependence) and (ii) compare the prevalence of these disorders with reference to data from the SMHS 2010.
Methods
Door-to-door household surveys were conducted with adult Singapore residents aged 18 years and above from 2016 to 2018 (n = 6126), yielding a response rate of 69.0%. The subjects were randomly selected using a disproportionate stratified sampling method and assessed using the World Health Organization Composite International Diagnostic Interview version 3.0 (WHO-CIDI 3.0). The diagnoses of lifetime and 12-month selected mental disorders, including MDD, dysthymia, bipolar disorder, GAD, OCD, and AUD (alcohol abuse and alcohol dependence), were based on the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) criteria.
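Under disproportionate stratified sampling, raw sample prevalence is biased toward the over-sampled strata; a design-weighted estimate re-weights each stratum's prevalence by its population share. The strata and counts below are hypothetical, and actual SMHS estimates also incorporate non-response and post-stratification weights.

```python
import numpy as np

# Hypothetical strata: case counts, sample sizes, and true population
# shares; stratum 2 is deliberately over-sampled relative to its share.
cases = np.array([60.0, 90.0, 40.0])
sampled = np.array([1_000.0, 600.0, 400.0])
pop_share = np.array([0.70, 0.20, 0.10])

# Design-weighted prevalence: weight each stratum's sample prevalence
# by its population share rather than its distorted sample share.
strat_prev = cases / sampled
weighted_prev = float(np.sum(strat_prev * pop_share))
print(round(100 * weighted_prev, 2))  # → 8.2
```

Note the unweighted estimate here would be 190/2000 = 9.5%, so ignoring the design would overstate prevalence because the higher-prevalence strata are over-represented in the sample.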
Results
The lifetime prevalence of at least one mood, anxiety or alcohol use disorder was 13.9% in the adult population. MDD had the highest lifetime prevalence (6.3%) followed by alcohol abuse (4.1%). The 12-month prevalence of any DSM-IV mental disorders was 6.5%. OCD had the highest 12-month prevalence (2.9%) followed by MDD (2.3%). Lifetime and 12-month prevalence of mental disorders assessed in SMHS 2016 (13.8% and 6.4%) was significantly higher than that in SMHS 2010 (12.0% and 4.4%). A significant increase was observed in the prevalence of lifetime GAD (0.9% to 1.6%) and alcohol abuse (3.1% to 4.1%). The 12-month prevalence of GAD (0.8% vs. 0.4%) and OCD (2.9% vs. 1.1%) was significantly higher in SMHS 2016 as compared to SMHS 2010.
Conclusions
The high prevalence of OCD and the increase across the two surveys needs to be tackled at a population level both in terms of creating awareness of the disorder and the need for early treatment. Youth emerge as a vulnerable group who are more likely to be associated with mental disorders and thus targeted interventions in this group with a focus on youth friendly and accessible care centres may lead to earlier detection and treatment of mental disorders.
Plant nitrogen (N) is linked with many physiological processes of crop growth and yield formation, and its accurate simulation is key to correctly predicting crop growth and yield. The aim of the current study was to improve the estimation of N uptake and translocation processes in the whole rice plant as well as within plant organs in the RiceGrow model by using plant and organ maximum, critical and minimum N dilution curves. The maximum and critical N (Nc) demand (obtained from the maximum and critical curves) of shoot and root and the Nc demand of organs (leaf, stem and panicle) are calculated from N concentration and biomass. Nitrogen distribution among organs is computed differently pre- and post-anthesis. Pre-anthesis distribution is determined by maximum N demand with no priority among organs. In post-anthesis distribution, panicle demands are met first and then the remaining N is allocated to the other organs without priority. The amount of plant N uptake depends on plant N demand and the N supplied by the soil. Calibration and validation of the established model were performed on field experiments conducted in China and the Philippines with varied N rates and N split applications; the results showed that this improved model can simulate the processes of N uptake and translocation well.
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
Knowledge, attitudes and practices (KAP) of the population regarding severe fever with thrombocytopenia syndrome (SFTS) in endemic areas of Lu'an in China were assessed before and after an intervention programme. The pre-intervention phase was conducted using a sample of 425 participants from the 12 selected villages with the highest rates of endemic SFTS infection. A predesigned interview questionnaire was used to assess KAP. Subsequently, an intervention programme was designed and applied in the selected villages. KAP was re-assessed for each population in the selected villages using the same interview questionnaire. Following 2 months of the programme, 339 participants had completed the reassessment survey. The impact of the intervention programme was evaluated using suitable statistical methods. A significant increase in the KAP and total KAP scores was noted following the intervention programme, and the proportions of respondents with correct knowledge, positive attitudes and effective practices toward SFTS increased significantly. The intervention programme was effective in improving the KAP level for SFTS in populations resident in endemic areas.
Following the success of the static analysis of free-free 2-D plane trusses using a self-regularization approach to obtain a unique solution, we extend the technique to 3-D space-truss problems. The inherent singular stiffness of a free-free structure is expanded to a bordered matrix by adding r singular vectors corresponding to zero singular values, where r is the nullity of the singular stiffness matrix. In addition, r constraints are appended to yield a nonsingular matrix. Only the pure particular solution with nontrivial strain is then obtained, without the homogeneous solution of no deformation. Linking to the Fredholm alternative theorem, slack variables with zero values indicate infinitely many solutions, while nonzero values imply that no solution exists. A simple space truss is used to demonstrate the validity of the proposed model. An alternative reasonable support system that results in a nonsingular stiffness matrix is also addressed. In addition, the commercial finite-element code ABAQUS is used to check the results.
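The bordering construction described in this abstract can be sketched on a toy free-free system. A 1-D spring chain (a hypothetical stand-in for a space truss, chosen so the arithmetic is transparent) has a singular stiffness matrix with nullity r = 1; appending the r rigid-body modes and r constraints gives a nonsingular bordered matrix whose solution is the pure particular (deforming) solution, with a zero slack variable signalling solvability under the Fredholm alternative.

```python
import numpy as np

# Free-free 1-D spring chain (unit stiffness): the stiffness matrix
# is singular with nullity r = 1 (one rigid-body translation mode).
K = np.array([[ 1.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.0]])

# Extract the r null-space vectors from the SVD.
_, s, Vt = np.linalg.svd(K)
r = int(np.sum(s < 1e-10 * s.max()))
V = Vt[-r:].T                                # (n, r) rigid-body modes

# Bordered (nonsingular) matrix: append the r modes and r constraints.
n = K.shape[0]
bordered = np.block([[K, V], [V.T, np.zeros((r, r))]])

# A self-equilibrated load (orthogonal to the rigid-body modes) gives
# the pure particular solution with no rigid-body component; the slack
# variable lam = 0 signals solvability (Fredholm alternative).
f = np.array([1.0, 0.0, -1.0])
sol = np.linalg.solve(bordered, np.concatenate([f, np.zeros(r)]))
u, lam = sol[:n], sol[n:]
print(u, lam)
```

A load with a net resultant (not orthogonal to the rigid-body modes) would instead produce a nonzero slack variable, corresponding to the no-solution branch of the Fredholm alternative.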
Macdunnoughia crassisigna Warren (Lepidoptera: Noctuidae) is a highly destructive herbivore that poses a serious risk to cotton, maize, soybean, and cruciferous vegetables in East Asia. Examining the effects of various biotic and abiotic factors on the flight performance of M. crassisigna is crucial for a better understanding of its trans-regional migration. In this study, the flight activity of M. crassisigna moths of different ages, under different temperatures and relative humidity (RH) levels, was evaluated by tethering individuals to computerized flight mills for a 24-h trial period. The results showed that M. crassisigna had the capacity for sustained flight: flight ability was strongest in 3-day-old individuals and decreased significantly in older moths. For both sexes, temperature had a significant effect on flight performance, with flight activity relatively higher at 24–28°C than at other temperatures. RH had a significant effect on all flight parameters of the tested moths, with flight activity relatively higher at 60–75% RH than at other RH levels. For 3-day-old moths under the optimum conditions (24°C and 75% RH) throughout the 24-h scotophase, the mean flight distance reached 66 km and the mean flight duration reached 13.5 h, suggesting that M. crassisigna possesses strong potential to undertake long-distance migration. These findings will be helpful for developing sound forecasting systems for this pest species.