Accelerating COVID-19 Treatment Interventions and Vaccines (ACTIV) was initiated by the US government in 2020 to rapidly develop and test vaccines and therapeutics against COVID-19. The ACTIV Therapeutics-Clinical Working Group selected ACTIV trial teams and clinical networks to expeditiously develop and launch master protocols based on therapeutic targets and patient populations. The suite of clinical trials was designed to collectively inform therapeutic care for COVID-19 outpatient, inpatient, and intensive care populations globally. In this report, we highlight challenges, strategies, and solutions around clinical protocol development and regulatory approval to document our experience and propose plans for similar future healthcare emergencies.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. 
Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Homeless shelter residents and staff may be at higher risk of SARS-CoV-2 infection. However, SARS-CoV-2 infection estimates in this population have been reliant on cross-sectional or outbreak investigation data. We conducted routine surveillance and outbreak testing in 23 homeless shelters in King County, Washington, to estimate the occurrence of laboratory-confirmed SARS-CoV-2 infection and risk factors during 1 January 2020–31 May 2021. Symptom surveys and nasal swabs were collected for SARS-CoV-2 testing by RT-PCR for residents aged ≥3 months and staff. We collected 12,915 specimens from 2,930 unique participants. We identified 4.74 (95% CI 4.00–5.58) SARS-CoV-2 infections per 100 individuals (residents: 4.96, 95% CI 4.12–5.91; staff: 3.86, 95% CI 2.43–5.79). Most infections were asymptomatic at the time of detection (74%) and detected during routine surveillance (73%). Outbreak testing yielded higher test positivity than routine surveillance (2.7% versus 0.9%). Among those infected, residents were less likely to report symptoms than staff. Participants who were vaccinated against seasonal influenza and were current smokers had lower odds of having an infection detected. Active surveillance that includes SARS-CoV-2 testing of all persons is essential in ascertaining the true burden of SARS-CoV-2 infections among residents and staff of congregate settings.
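The reported per-100 incidence and its interval can be reproduced approximately from the aggregate figures. A minimal sketch, with two stated assumptions: the case count of 139 is back-calculated from the reported 4.74 per 100 among 2,930 participants, and the paper's exact interval method is not given, so a Wilson score interval is used here as one common choice:

```python
import math

def rate_per_100(cases, n, z=1.96):
    """Point estimate and Wilson score interval, expressed per 100 individuals."""
    p = cases / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))) / denom
    return 100 * p, 100 * (center - half), 100 * (center + half)

# 139 cases is an assumption inferred from the reported rate, not a figure
# stated in the abstract
rate, lo, hi = rate_per_100(139, 2930)
```

This lands close to the reported 4.74 (95% CI 4.00–5.58) per 100; small differences from the published bounds would reflect the authors' choice of interval method.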
Nosocomial transmission of influenza is a major concern for infection control. We aimed to dissect transmission dynamics of influenza, including asymptomatic transmission events, in acute care.
Design:
Prospective surveillance study during 2 influenza seasons.
Setting:
Tertiary-care hospital.
Participants:
Volunteer sample of inpatients on medical wards and healthcare workers (HCWs).
Methods:
Participants provided daily illness diaries and nasal swabs for influenza A and B detection and whole-genome sequencing for phylogenetic analyses. Contacts between study participants were tracked. Secondary influenza attack rates were calculated based on spatial and temporal proximity and phylogenetic evidence for transmission.
Results:
In total, 152 HCWs and 542 inpatients were included; 16 HCWs (10.5%) and 19 inpatients (3.5%) tested positive for influenza on 109 study days. Study participants had symptoms of disease on most of the days they tested positive for influenza (83.1% and 91.9% for HCWs and inpatients, respectively). Also, 11 (15.5%) of 71 influenza-positive swabs among HCWs and 3 (7.9%) of 38 influenza-positive swabs among inpatients were collected on days without symptoms; 2 (12.5%) of 16 HCWs and 2 (10.5%) of 19 inpatients remained fully asymptomatic. The secondary attack rate was low: we recorded 1 transmission event over 159 contact days (0.6%) that originated from a symptomatic case. No transmission event occurred in 61 monitored days of contacts with asymptomatic influenza-positive individuals.
Conclusions:
Influenza in acute care is common, and individuals regularly shed influenza virus without exhibiting symptoms. Nevertheless, both symptomatic and asymptomatic transmission events proved rare. Healthcare-associated influenza prevention strategies based on preseason vaccination and barrier precautions for symptomatic individuals therefore appear effective.
To assess influenza symptoms, adherence to mask use recommendations, and absenteeism and presenteeism among acute care healthcare workers (HCWs) during influenza epidemics.
Methods:
The TransFLUas influenza transmission study in acute healthcare prospectively followed HCWs over 2 consecutive influenza seasons. Symptom diaries recording respiratory symptoms and adherence to mask use recommendations were completed on a daily basis, and study participants provided midturbinate nasal swabs for influenza testing.
Results:
In total, 152 HCWs (65.8% nurses and 13.2% physicians) were included: 89.1% of study participants reported at least 1 influenza symptom during their study season, and 77.8% suffered from respiratory symptoms. Also, 28.3% of HCWs missed at least 1 working day during the study period; 82.6% of these days were missed because of symptoms of influenza illness. Of all participating HCWs, 67.9% worked with symptoms of influenza infection on 8.8% of study days. On 0.3% of study days, symptomatic HCWs were shedding influenza virus while at work. Among HCWs with respiratory symptoms, 74.1% adhered to the policy to wear a mask at work on 59.1% of days with respiratory symptoms.
Conclusions:
Respiratory disease is frequent among HCWs and imposes a significant economic burden on hospitals due to the number of working days lost. Presenteeism with respiratory illness, including influenza, is also frequent and poses a risk for patients and staff.
Acute ischemic stroke may affect women and men differently. We aimed to evaluate sex differences in outcomes of endovascular treatment (EVT) for ischemic stroke due to large vessel occlusion in a population-based study in Alberta, Canada.
Methods and Results:
Over a 3-year period (April 2015–March 2018), 576 patients fit the inclusion criteria of our study and constituted the EVT group of our analysis. The medical treatment group of the ESCAPE trial had 150 patients. Thus, our total sample size was 726. We captured outcomes in clinical routine using administrative data and a linked database methodology. The primary outcome of our study was home-time. Home-time refers to the number of days that the patient was back at their premorbid living situation without an increase in the level of care within 90 days of the index stroke event. In adjusted analysis, EVT was associated with an increase in 90-day home-time by an average of 6.08 days (95% CI −2.74–14.89, p-value 0.177) in women compared to an average of 11.20 days (95% CI 1.94–20.46, p-value 0.018) in men. Further analysis revealed that the association between EVT and 90-day home-time in women was confounded by age and onset-to-treatment time.
Conclusions:
We found a nonsignificant nominal reduction of 90-day home-time gain for women compared to men in this province-wide population-based study of EVT for large vessel occlusion, which was only partially explained by confounding.
Emergency Medical Services (EMS) systems have developed protocols for prehospital activation of the cardiac catheterization laboratory for patients with suspected ST-elevation myocardial infarction (STEMI) to decrease first-medical-contact-to-balloon time (FMC2B). The rate of “false positive” prehospital activations is high. In order to decrease this rate and expedite care for patients with true STEMI, the American Heart Association (AHA; Dallas, Texas USA) developed the Mission Lifeline PreAct STEMI algorithm, which was implemented in Los Angeles County (LAC; California USA) in 2015. The hypothesis of this study was that implementation of the PreAct algorithm would increase the positive predictive value (PPV) of prehospital activation.
Methods:
This is an observational pre-/post-study of the effect of the implementation of the PreAct algorithm for patients with suspected STEMI transported to one of five STEMI Receiving Centers (SRCs) within the LAC Regional System. The primary outcome was the PPV of cardiac catheterization laboratory activation for percutaneous coronary intervention (PCI) or coronary artery bypass graft (CABG). The secondary outcome was FMC2B.
Results:
A total of 1,877 patients were analyzed for the primary outcome in the pre-intervention period and 405 patients in the post-intervention period. There was an overall decrease in cardiac catheterization laboratory activations, from 67% in the pre-intervention period to 49% in the post-intervention period (95% CI for the difference, -22% to -14%). The overall rate of cardiac catheterization declined in the post-intervention period as compared with the pre-intervention period, from 34% to 30% (95% CI for the difference, -7.6% to 0.4%), but actually increased for subjects who had activation (48% versus 58%; 95% CI, 4.6%-15.0%). Implementation of the PreAct algorithm was associated with an increase in the PPV of activation for PCI or CABG from 37.9% to 48.6%. The overall odds ratio (OR) associated with the intervention was 1.4 (95% CI, 1.1-1.8). The effect of the intervention was to decrease variability between medical centers. There was no associated change in average FMC2B.
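The direction of the headline effect can be checked from the reported proportions alone. A sketch under one caveat: the crude odds ratio computed directly from the two aggregate PPVs comes out near 1.5, slightly above the reported 1.4, which is presumably adjusted or model-based rather than this raw calculation:

```python
def odds(p):
    """Convert a proportion to odds, p / (1 - p)."""
    return p / (1 - p)

# PPV of cardiac catheterization laboratory activation, from the abstract
ppv_pre, ppv_post = 0.379, 0.486

# Crude (unadjusted) odds ratio for a true-positive activation post- vs pre-PreAct
crude_or = odds(ppv_post) / odds(ppv_pre)
```

The crude value falling inside the reported 95% CI of 1.1–1.8 is consistent with the published estimate.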
Conclusions:
The implementation of the PreAct algorithm in the LAC EMS system was associated with an overall increase in the PPV of cardiac catheterization laboratory activation.
Previous cross-lagged studies on depression and memory impairment among the elderly have revealed conflicting findings relating to the direction of influence between depression and memory impairment. The current study aims to clarify this direction of influence by examining the cross-lagged relationships between memory impairment and depression in an Asian sample of elderly community dwellers, as well as synthesizing previous relevant cross-lagged findings via a meta-analysis.
Methods
A total of 160 participants (Mage = 68.14, s.d. = 5.34) were assessed across two time points (average of 1.9 years apart) on measures of memory and depressive symptoms. The data were then fitted to a structural equation model to examine two cross-lagged effects (i.e. depressive symptoms→memory; memory→depressive symptoms). A total of 14 effect-sizes for each of the two cross-lagged directions were extracted from six studies (including the present; total N = 8324). These effects were then meta-analyzed using a three-level mixed effects model.
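The cross-lagged logic above can be sketched with two lagged regressions on simulated two-wave data. This is illustrative only: the study fits a structural equation model, and every coefficient and sample size below is invented. Each lagged path (baseline memory → later depression, and the reverse) is estimated while controlling for the outcome's own baseline, which is the core of the cross-lagged design:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000  # invented sample size, chosen large so the estimates are stable

# Simulated standardized baseline scores (illustrative, not study data)
mem1 = rng.normal(size=n)
dep1 = rng.normal(size=n)

# Follow-up wave: memory -> later depression path set to -0.30,
# reverse (depression -> later memory) path set to zero
dep2 = 0.40 * dep1 - 0.30 * mem1 + rng.normal(scale=0.5, size=n)
mem2 = 0.50 * mem1 + 0.00 * dep1 + rng.normal(scale=0.5, size=n)

# Design matrix: intercept, baseline memory, baseline depression
X = np.column_stack([np.ones(n), mem1, dep1])

# Cross-lagged estimates via ordinary least squares
b_dep2, *_ = np.linalg.lstsq(X, dep2, rcond=None)  # [intercept, mem1->dep2, dep1->dep2]
b_mem2, *_ = np.linalg.lstsq(X, mem2, rcond=None)  # [intercept, mem1->mem2, dep1->mem2]
```

Recovering a nonzero memory→depression coefficient and a near-zero depression→memory coefficient mirrors the unidirectional pattern the study reports.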
Results
In the current sample, lower memory ability at baseline was associated with worse depressive symptoms levels at follow-up, after controlling for baseline depressive symptoms. However, the reverse effect was not significant; baseline depressive symptoms did not predict subsequent memory ability after controlling for baseline memory. The results of the meta-analysis revealed the same pattern of relationship between memory and depressive symptoms.
Conclusions
These results provide robust evidence that the relationship between memory impairment and depressive symptoms is unidirectional: memory impairment predicts subsequent depressive symptoms, but not vice versa. The implications of these findings are discussed.
A method of calculating confidence intervals of the “area of influence” of a weed plant, and of yield losses calculated from it, was developed. In a worked example using published data, the confidence intervals of the area of influence were found to be large. Yield losses calculated from this method were less precisely estimated than those from a more traditional additive density experiment. This limited evidence suggests that to give similar precision, the area of influence experiments may need to be at least double their present size. If this is indeed the case, published statements on the space, time, and effort advantages of the area of influence design will need to be treated with caution.
A significant minority of people presenting with a major depressive episode (MDE) experience co-occurring subsyndromal hypo/manic symptoms. As this presentation may have important prognostic and treatment implications, the DSM–5 codified a new nosological entity, the “mixed features specifier,” referring to individuals meeting threshold criteria for an MDE and subthreshold symptoms of (hypo)mania or to individuals with syndromal mania and subthreshold depressive symptoms. The mixed features specifier adds to a growing list of monikers that have been put forward to describe phenotypes characterized by the admixture of depressive and hypomanic symptoms (e.g., mixed depression, depression with mixed features, or depressive mixed states [DMX]). Current treatment guidelines, regulatory approvals, as well as the current evidentiary base provide insufficient decision support to practitioners who provide care to individuals presenting with an MDE with mixed features. In addition, all existing psychotropic agents evaluated in mixed patients have largely been confined to patient populations meeting the DSM–IV definition of “mixed states,” wherein the co-occurrence of threshold-level mania and threshold-level MDE was required. Toward the aim of assisting clinicians providing care to adults with MDE and mixed features, we have assembled a panel of experts on mood disorders to develop these guidelines on the recognition and treatment of mixed depression, based on the few studies that have focused specifically on DMX as well as decades of cumulated clinical experience.
Weed maps are typically produced from data sampled at discrete intervals on a regular grid. Errors are expected to occur as data are sampled at increasingly coarse scales. To demonstrate the potential effect of sampling strategy on the quality of weed maps, we analyzed a data set comprising the counts of capeweed in 225,000 quadrats completely covering a 0.9-ha area. The data were subsampled at different grid spacings, quadrat sizes, and starting points and were then used to produce maps by kriging. Spacings of 10 m were found to overestimate the geostatistical range by 100% and missed details apparently resulting from the spraying equipment. Some evidence was found supporting the rule of thumb that surveys should be conducted at a spacing of about half the scale of interest. Quadrat size had less effect than spacing on the map quality. At wider spacings the starting position of the sample grid had a considerable effect on the qualities of the maps but not on the estimated geostatistical range. Continued use of arbitrary survey designs is likely to miss the information of interest to biologists and may possibly produce maps inappropriate to spray application technology.
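The sampling-scale effect described above can be illustrated with an empirical semivariogram on a simulated 1-D transect. This is a sketch under invented parameters, not the paper's capeweed data or its kriging procedure: a spatially correlated field has a semivariogram that rises up to its range and then plateaus, and a grid subsampled at coarse spacing can only estimate lags at or above that spacing, so any structure finer than the spacing is invisible:

```python
import numpy as np

rng = np.random.default_rng(1)

# Correlated 1-D "weed count" surrogate: moving average of white noise,
# so spatial correlation extends roughly over the averaging window (20 cells)
white = rng.normal(size=10_000)
z = np.convolve(white, np.ones(20) / 20, mode="valid")

def semivariogram(vals, lag):
    """Classical estimator: half the mean squared difference at a given lag."""
    d = vals[lag:] - vals[:-lag]
    return 0.5 * np.mean(d**2)

# Rises through the correlation range (~20 cells), then sits at the sill
gammas = {h: semivariogram(z, h) for h in (1, 5, 15, 40)}

# Sampling every 10th cell: the shortest estimable lag now corresponds to
# 10 cells on the full grid, so all sub-10-cell structure is lost
coarse = z[::10]
gamma_coarse = semivariogram(coarse, 1)
```

The coarse-grid estimate matches the full-grid semivariogram at lag 10, which is the geostatistical analogue of the abstract's point that wide spacings miss fine-scale detail and distort the estimated range.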
This paper argues that EU regulatory practice in the food area may be unnecessarily applying the Precautionary Principle by focussing on upper intake limits for naturally occurring nutrients while not controlling the quality of the ingredients used in commercial products, even though precedents of public health issues arising from adulterated ingredients do exist. Risk governance depends heavily on expert evidence, and the case of amino acid supplements is used to document an industry-supported effort to strengthen the science database and thus enhance the regulatory process, ensuring that amino acid use in the EU is regulated safely and proportionately. Scientific work conducted in the last decade by the not-for-profit association, the International Council on Amino Acid Science (ICAAS), is used as a simple case study highlighting the role of proactive clinical research in an era characterized by precaution in risk management, escalating costs of scientific research, and the growing influence of the internet.
Recent studies suggest that sand can serve as a vehicle for exposure of humans to pathogens at beach sites, resulting in increased health risks. Sampling for microorganisms in sand should therefore be considered for inclusion in regulatory programmes aimed at protecting recreational beach users from infectious disease. Here, we review the literature on pathogen levels in beach sand, and their potential for affecting human health. In an effort to provide specific recommendations for sand sampling programmes, we outline published guidelines for beach monitoring programmes, which are currently focused exclusively on measuring microbial levels in water. We also provide background on spatial distribution and temporal characteristics of microbes in sand, as these factors influence sampling programmes. First steps toward establishing a sand sampling programme include identifying appropriate beach sites and use of initial sanitary assessments to refine site selection. A tiered approach is recommended for monitoring. This approach would include the analysis of samples from many sites for faecal indicator organisms and other conventional analytes, while testing for specific pathogens and unconventional indicators is reserved for high-risk sites. Given the diversity of microbes found in sand, studies are urgently needed to identify the most significant aetiological agent of disease and to relate microbial measurements in sand to human health risk.