An important component of post-release monitoring of biological control of invasive plants is the tracking of species interactions. During post-release monitoring following the initial releases of the weevil Ceutorhynchus scrobicollis Neresheimer and Wagner (Coleoptera: Curculionidae) on garlic mustard, Alliaria petiolata (Marschall von Bieberstein) Cavara and Grande (Brassicaceae), in Ontario, Canada, we identified the presence of larvae of the tumbling flower beetle, Mordellina ancilla LeConte (Coleoptera: Mordellidae), in garlic mustard stems. This study documents the life history of M. ancilla on garlic mustard to assess potential interactions between M. ancilla and C. scrobicollis as a biological control agent. Garlic mustard stems were sampled at eight sites across southern Ontario over the course of one year to record the prevalence of this association and to observe the beetle's life cycle on the plant. We found M. ancilla to be a widespread stem-borer of late second-year and dead garlic mustard plants across sampling locations. This is the first host record for M. ancilla on garlic mustard. The observed life cycle of M. ancilla indicates that it is unlikely to negatively impact the growth and reproduction of garlic mustard and unlikely to interfere with the use of C. scrobicollis as a biological control agent.
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
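To make the PRS approach concrete, the sketch below is a minimal, hypothetical Python illustration of scoring individuals with per-SNP weights and testing whether the resulting score separates BPD from MDD cases. The simulated genotypes, weights, and sample sizes are placeholders, not the study's pipeline (which used PGC cohorts and standard GWAS/PRS tooling).

```python
# Minimal sketch of a case-case PRS analysis; all data here are simulated placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

n_individuals, n_snps = 2000, 5000
dosages = rng.integers(0, 3, size=(n_individuals, n_snps)).astype(float)  # 0/1/2 allele counts
weights = rng.normal(0, 0.01, size=n_snps)        # per-SNP effect sizes from a training GWAS
is_bpd = rng.integers(0, 2, size=n_individuals)   # 1 = BPD case, 0 = MDD case

# Polygenic risk score: weighted sum of risk-allele dosages.
prs = dosages @ weights

# Logistic regression of diagnosis on the standardized PRS.
z = (prs - prs.mean()) / prs.std()
model = LogisticRegression().fit(z.reshape(-1, 1), is_bpd)
auc = roc_auc_score(is_bpd, model.predict_proba(z.reshape(-1, 1))[:, 1])
print(f"BPD-vs-MDD discrimination AUC: {auc:.3f}")  # ~0.5 on random data, by construction
```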
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD patients, and BPD patients lie primarily on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
Pragmatic trials aim to speed translation to practice by integrating study procedures into routine care settings. This study evaluated implementation outcomes related to clinician and patient recruitment and participation in a trial of community paramedicine (CP) and describes successes and challenges of maintaining pragmatic study features.
Methods:
Adults in the pre-hospital setting, emergency department (ED), or hospital being considered for referral to the ED/hospital or continued hospitalization for intermediate-level care were randomized 1:1 to CP care or usual care. Referral and enrollment data were tracked administratively, and patient characteristics were abstracted from the electronic health record (EHR). Enrolled patients completed baseline surveys, and a subset of intervention patients were interviewed. All CPs and a sample of clinicians and administrators were invited to complete a survey and interview.
Results:
Between January 2022 and February 2023, 240 enrolled patients (42% rural) completed surveys, and 22 completed an interview; 63 staff completed surveys and 20 completed an interview. Ninety-three clinicians in 27 departments made at least one referral. Factors related to referrals included program awareness and understanding of the CP scope of practice. Most patients were enrolled in the hospital, but their characteristics were similar to those of the primary care population and included older and medically complex patients. Challenges to achieving representativeness included limited EHR infrastructure, constraints related to patient consent, and clinician concerns about patient randomization disrupting preferred care.
Conclusion:
Future pragmatic trials in busy clinical settings may benefit from regulatory policies and EHR capabilities that allow for real-world study conduct and representative participation. Trial registration: NCT05232799.
Cognitive training has shown promise for improving cognition in older adults. Aging involves a variety of neuroanatomical changes that may affect response to cognitive training. White matter hyperintensities (WMH), visible on T2-weighted and fluid-attenuated inversion recovery (FLAIR) MRI, are one common age-related brain change. WMH are associated with older age, are suggestive of cerebral small vessel disease, and reflect decreased white matter integrity. Higher WMH load is associated with a reduced threshold for the clinical expression of cognitive impairment and dementia. The effects of WMH on response to cognitive training interventions remain relatively unknown. The current study assessed (a) proximal cognitive training performance following a 3-month randomized controlled trial and (b) the contribution of baseline whole-brain WMH load, defined as total lesion volume (TLV), to pre-post proximal training change.
Participants and Methods:
Sixty-two healthy older adults aged 65–84 completed either adaptive cognitive training (CT; n = 31) or educational training control (ET; n = 31) interventions. Participants assigned to CT completed 20 hours of attention/processing speed training and 20 hours of working memory training delivered through the commercially available Posit Science BrainHQ platform. ET participants completed 40 hours of educational videos. All participants also underwent sham or active transcranial direct current stimulation (tDCS) as an adjunctive intervention, although tDCS was not a variable of interest in the current study. Multimodal MRI scans were acquired during the baseline visit. T1-weighted and T2-weighted FLAIR images were processed using the Lesion Segmentation Tool (LST) for SPM12. The Lesion Prediction Algorithm of LST automatically segmented brain tissue and calculated lesion maps. A lesion threshold of 0.30 was applied to calculate TLV, and a log transformation was applied to TLV to normalize the distribution of WMH. Repeated-measures analysis of covariance (RM-ANCOVA) assessed pre/post change in proximal composite (Total Training Composite) and sub-composite (Processing Speed Training Composite, Working Memory Training Composite) measures in the CT group compared with their ET counterparts, controlling for age, sex, years of education, and tDCS group. Linear regression assessed the effect of TLV on post-intervention proximal composite and sub-composite scores, controlling for baseline performance, intervention assignment, age, sex, years of education, multisite scanner differences, estimated total intracranial volume, and binarized cardiovascular disease risk.
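As a rough illustration of the TLV analysis, the following Python sketch log-transforms simulated lesion volumes and regresses a post-intervention score on log-TLV with a reduced covariate set. All variable names and data are hypothetical; the study's full model included additional covariates and separate composite outcomes.

```python
# Hypothetical sketch of the log-TLV regression described above (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 62
df = pd.DataFrame({
    "tlv_ml": rng.lognormal(mean=0.5, sigma=1.0, size=n),  # total lesion volume (mL), skewed
    "baseline": rng.normal(50, 10, size=n),                # baseline composite score
    "group": rng.integers(0, 2, size=n),                   # 1 = CT, 0 = ET
    "age": rng.integers(65, 85, size=n),
})
df["log_tlv"] = np.log(df["tlv_ml"])  # normalize the skewed WMH distribution
# Post scores simulated with a small negative TLV effect, for illustration only.
df["post"] = df["baseline"] + 2 * df["group"] - 1.5 * df["log_tlv"] + rng.normal(0, 3, n)

# Post-intervention score regressed on log-TLV, controlling for baseline and group
# (the study additionally adjusted for sex, education, scanner, eTIV, and CVD risk).
fit = smf.ols("post ~ log_tlv + baseline + group + age", data=df).fit()
print(fit.params["log_tlv"], fit.pvalues["log_tlv"])
```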
Results:
RM-ANCOVA revealed two-way group × time interactions such that participants assigned to cognitive training demonstrated greater improvement on proximal composite (Total Training Composite) and sub-composite (Processing Speed Training Composite, Working Memory Training Composite) measures compared with their ET counterparts. Multiple linear regression showed that higher baseline TLV was associated with smaller pre-post change on the Processing Speed Training sub-composite (β = −0.19, p = 0.04) but not on the other composite measures.
Conclusions:
These findings demonstrate the utility of cognitive training for improving post-intervention proximal performance in older adults. Additionally, pre-post change in proximal processing speed training appears to be more sensitive to white matter hyperintensity load than change in working memory training. These data suggest that TLV may be an important factor to consider when planning processing speed-based cognitive training interventions for remediation of cognitive decline in older adults.
Aging is associated with disruptions in functional connectivity within the default mode (DMN), frontoparietal control (FPCN), and cingulo-opercular (CON) resting-state networks. Greater within-network connectivity predicts better cognitive performance in older adults. Therefore, strengthening network connectivity, through targeted intervention strategies, may help prevent age-related cognitive decline or progression to dementia. Small studies have demonstrated synergistic effects of combining transcranial direct current stimulation (tDCS) and cognitive training (CT) on strengthening network connectivity; however, this association has yet to be rigorously tested on a large scale. The current study leverages longitudinal data from the first-ever Phase III clinical trial for tDCS to examine the efficacy of an adjunctive tDCS and CT intervention on modulating network connectivity in older adults.
Participants and Methods:
This sample included 209 older adults (mean age = 71.6) from the Augmenting Cognitive Training in Older Adults multisite trial. Participants completed 40 hours of CT over 12 weeks, comprising eight attention, processing speed, and working memory tasks. Participants were randomized into active or sham stimulation groups, and tDCS was administered during CT daily for two weeks, then weekly for 10 weeks. For both stimulation groups, two electrodes in saline-soaked 5 × 7 cm sponges were placed at F3 (cathode) and F4 (anode) using the 10–20 measurement system. The active group received 2 mA of current for 20 minutes. The sham group received 2 mA for 30 seconds, then no current for the remaining 20 minutes.
Participants underwent resting-state fMRI at baseline and post-intervention. The CONN toolbox was used to preprocess imaging data and conduct region-of-interest (ROI-ROI) connectivity analyses. The Artifact Detection Toolbox, using intermediate settings, identified outlier volumes. Two participants were excluded for having more than 50% of volumes flagged as outliers. ROI-ROI analyses modeled the interaction between tDCS group (active versus sham) and occasion (baseline versus post-intervention connectivity) for the DMN, FPCN, and CON, controlling for age, sex, education, site, and adherence.
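For readers unfamiliar with the design, a simplified group-by-occasion interaction test on a single ROI pair might look like the hypothetical Python sketch below; the actual analyses were performed in the CONN toolbox across all within-network ROI pairs with FDR correction, and all names and data here are placeholders.

```python
# Hypothetical group x occasion interaction test for one ROI-ROI connectivity value.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 209
subj = np.repeat(np.arange(n), 2)
occasion = np.tile([0, 1], n)                     # 0 = baseline, 1 = post-intervention
group = np.repeat(rng.integers(0, 2, size=n), 2)  # 1 = active tDCS, 0 = sham
subj_effect = np.repeat(rng.normal(0, 0.05, n), 2)
# Simulated connectivity with a small active-group post-intervention increase.
conn = 0.3 + subj_effect + 0.05 * group * occasion + rng.normal(0, 0.1, size=2 * n)

df = pd.DataFrame({"subj": subj, "occasion": occasion, "group": group, "conn": conn})
# A random intercept per participant approximates the repeated-measures
# interaction test (covariates omitted for brevity).
fit = smf.mixedlm("conn ~ group * occasion", df, groups=df["subj"]).fit()
print(fit.summary())
```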
Results:
Compared with sham, the active group demonstrated ROI-ROI increases in functional connectivity within the DMN following intervention (left temporal to right temporal [T(202) = 2.78, pFDR < 0.05] and left temporal to right dorsal medial prefrontal cortex [T(202) = 2.74, pFDR < 0.05]). In contrast, compared with sham, the active group demonstrated ROI-ROI decreases in functional connectivity within the FPCN following intervention (left dorsal prefrontal cortex to left temporal [T(202) = −2.96, pFDR < 0.05] and left dorsal prefrontal cortex to left lateral prefrontal cortex [T(202) = −2.77, pFDR < 0.05]). There were no significant interactions detected for CON regions.
Conclusions:
These findings (a) demonstrate the feasibility of modulating network connectivity using tDCS and CT and (b) provide important information regarding the pattern of connectivity changes occurring at these intervention parameters in older adults. Importantly, the active stimulation group showed increases in connectivity within the DMN (a network particularly vulnerable to aging and implicated in Alzheimer’s disease) but decreases in connectivity between left frontal and temporal FPCN regions. Future analyses from this trial will evaluate the association between these changes in connectivity and cognitive performance post-intervention and at a one-year timepoint.
Nonpathological aging has been linked to decline in both verbal and visuospatial memory abilities in older adults. Disruptions in resting-state functional connectivity within well-characterized, higher-order cognitive brain networks have also been coupled with poorer memory functioning in healthy older adults and in older adults with dementia. However, there is a paucity of research on the association between higher-order functional connectivity and verbal and visuospatial memory performance in the older adult population. The current study examines the association between resting-state functional connectivity within the cingulo-opercular network (CON), frontoparietal control network (FPCN), and default mode network (DMN) and verbal and visuospatial learning and memory in a large sample of healthy older adults. We hypothesized that greater within-network CON and FPCN functional connectivity would be associated with better immediate verbal and visuospatial memory recall. Additionally, we predicted that within-network DMN functional connectivity would be associated with better delayed verbal and visuospatial memory recall. This study helps clarify whether within-network CON, FPCN, or DMN functional connectivity is associated with verbal and visuospatial memory abilities in later life.
Participants and Methods:
330 healthy older adults between 65 and 89 years old (mean age = 71.6 ± 5.2) were recruited at the University of Florida (n = 222) and the University of Arizona (n = 108). Participants underwent resting-state fMRI and completed verbal memory (Hopkins Verbal Learning Test - Revised [HVLT-R]) and visuospatial memory (Brief Visuospatial Memory Test - Revised [BVMT-R]) measures. Immediate (total) and delayed recall scores on the HVLT-R and BVMT-R were calculated using each test manual’s scoring criteria. Learning ratios on the HVLT-R and BVMT-R were quantified by dividing the number of stimuli (verbal or visuospatial) learned between the first and third trials by the number of stimuli not recalled after the first learning trial. CONN Toolbox was used to extract average within-network connectivity values for CON, FPCN, and DMN. Hierarchical regressions were conducted, controlling for sex, race, ethnicity, years of education, number of invalid scans, and scanner site.
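The learning-ratio computation translates directly to code. The short Python sketch below applies the definition above (items gained between trials 1 and 3, divided by items not yet recalled after trial 1), with item totals taken from the standard HVLT-R (12 words) and BVMT-R (6 figures) forms.

```python
# Learning ratio as defined above: items gained from trial 1 to trial 3,
# divided by the items still available to learn after trial 1.
def learning_ratio(trial1_recalled: int, trial3_recalled: int, total_items: int) -> float:
    """total_items: 12 for HVLT-R words, 6 for BVMT-R figures."""
    gained = trial3_recalled - trial1_recalled
    remaining = total_items - trial1_recalled  # stimuli not recalled after trial 1
    return gained / remaining if remaining > 0 else float("nan")

# Example: 5 of 12 HVLT-R words on trial 1, 10 on trial 3 -> (10-5)/(12-5) = 0.71
print(round(learning_ratio(5, 10, 12), 2))
```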
Results:
Greater CON connectivity was significantly associated with better HVLT-R immediate (total) recall (β = 0.16, p = 0.01), HVLT-R learning ratio (β = 0.16, p = 0.01), BVMT-R immediate (total) recall (β = 0.14, p = 0.02), and BVMT-R delayed recall performance (β = 0.15, p = 0.01). Greater FPCN connectivity was associated with better BVMT-R learning ratio (β = 0.13, p = 0.04). HVLT-R delayed recall performance was not associated with connectivity in any network, and DMN connectivity was not significantly related to any measure.
Conclusions:
Connectivity within the CON demonstrated a robust relationship with multiple components of memory function across both verbal and visuospatial domains. In contrast, the FPCN evidenced a relationship only with visuospatial learning, and the DMN was not significantly associated with any memory measure. These data suggest that the CON may be a valuable target in longitudinal studies of age-related memory changes, as well as a possible target for future non-invasive interventions to attenuate memory decline in older adults.
The UK's services for adult attention-deficit hyperactivity disorder (ADHD) are in crisis, with demand outstripping capacity and waiting times reaching unprecedented lengths. Recognition of and treatments for ADHD have expanded over the past two decades, increasing clinical demand. This issue has been exacerbated by the COVID-19 pandemic. Despite an increase in specialist services, resource allocation has not kept pace, leading to extended waiting times. Underfunding has encouraged growth in independent providers, leading to fragmentation of service provision. Treatment delays carry a human and financial cost, imposing a burden on health, social care and the criminal justice system. A rethink of service procurement and delivery is needed, with multiple solutions on the table, including increasing funding, improving system efficiency, altering the service provision model and clinical prioritisation. However, the success of these solutions hinges on fiscal capacity and workforce issues.
Novel ST398 methicillin-susceptible Staphylococcus aureus (MSSA) was first observed in the United States in New York City (2004–2007); its diffusion across the country resulted in changing treatment options. Utilizing outpatient antimicrobial susceptibility data from the Veterans Health Administration from 2010 to 2019, we document the spatiotemporal prevalence of potential ST398 MSSA.
We recently reported on the radio-frequency attenuation length of cold polar ice at Summit Station, Greenland, based on bi-static radar measurements of radio-frequency bedrock echo strengths taken during the summer of 2021. Those data also allow studies of (a) the relative contributions of coherent (such as discrete internal conducting layers with sub-centimeter transverse scale) vs incoherent (e.g. bulk volumetric) scattering, (b) the magnitude of internal layer reflection coefficients, (c) limits on signal propagation velocity asymmetries (‘birefringence’) and (d) limits on signal dispersion in ice over a bandwidth of ~100 MHz. We find that (1) attenuation lengths approach 1 km in our band, (2) after averaging 10 000 echo triggers, reflected signals observable over the thermal floor (to depths of ~1500 m) are consistent with being entirely coherent, (3) internal layer reflectivities are approximately −60 to −70 dB, (4) birefringent effects for vertically propagating signals are smaller by an order of magnitude relative to the South Pole and (5) within our experimental limits, glacial ice is non-dispersive over the frequency band relevant for neutrino detection experiments.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Effective management of the introduced invasive grass common reed [Phragmites australis (Cav.) Trin. ex Steud.] requires the ability to differentiate between the introduced and native subspecies found in North America. While genetic tools are useful for discriminating between the subspecies, morphological identification is a useful complementary approach that is low to zero cost and does not require specialized equipment or technical expertise. The objective of our study was to identify the best morphological traits for rapid and simple identification of native and introduced P. australis. A suite of 22 morphological traits were measured in 21 introduced and 27 native P. australis populations identified by genetic barcoding across southern Ontario, Canada. Traits were compared between the subspecies to identify measurements that offered reliable, diagnostic separation. Overall, 21 of the 22 traits differed between the subspecies, with four offering complete separation: the retention of leaf sheaths on dead stems; a categorical assessment of stem color; the base height of the ligule, excluding the hairy fringe; and a combined measurement of leaf length and lower glume length. Additionally, round fungal spots on the stem occurred only on the native subspecies and never on the sampled introduced populations. The high degree of variation observed in traits within and between the subspecies cautions against a “common wisdom” approach to identification or automatic interpretation of intermediate traits as indicative of aberrant populations or hybridization. As an alternative, we have compiled the five best traits into a checklist of simple and reliable measurements to identify native and introduced P. australis. This guide will be most applicable for samples collected in the late summer and fall in the Great Lakes region but can also inform best practices for morphological identification in other regions.
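To illustrate how such a checklist might be operationalized, the Python sketch below scores three of the reported diagnostic traits with a simple majority vote. The trait directions and the voting rule are assumptions for illustration only (the abstract confirms only that fungal spots occurred exclusively on natives); the published checklist should be consulted for the actual criteria and cut-offs.

```python
# Illustrative trait-checklist classifier; directions and rule are assumed,
# NOT the study's published criteria.
def classify_phragmites(sheaths_retained: bool, stem_color_category: str,
                        fungal_spots: bool) -> str:
    votes_native = 0
    if not sheaths_retained:           # assumed: natives tend to shed leaf sheaths
        votes_native += 1
    if stem_color_category == "red":   # categorical stem-color assessment (placeholder level)
        votes_native += 1
    if fungal_spots:                   # round fungal spots occurred only on natives (from abstract)
        votes_native += 1
    return "native" if votes_native >= 2 else "introduced"

print(classify_phragmites(sheaths_retained=False, stem_color_category="red",
                          fungal_spots=True))  # -> "native"
```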
Urine cultures collected from catheterized patients have a high likelihood of false-positive results due to colonization. We examined the impact of a clinical decision support (CDS) tool that includes catheter information on test utilization and patient-level outcomes.
Methods:
This before-and-after intervention study was conducted at 3 hospitals in North Carolina. In March 2021, a CDS tool was incorporated into urine-culture order entry in the electronic health record, providing education about indications for culture and suggesting catheter removal or exchange prior to specimen collection for catheters present >7 days. We used an interrupted time-series analysis with Poisson regression to evaluate the impact of CDS implementation on utilization of urinalyses and urine cultures, antibiotic use, and other outcomes during the pre- and postintervention periods.
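A minimal version of the segmented Poisson regression underlying such an interrupted time-series analysis is sketched below in Python with simulated monthly counts. The intervention month, covariates, and effect sizes are placeholders rather than the study's actual model.

```python
# Hypothetical segmented (interrupted time-series) Poisson regression sketch.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
months = np.arange(36)                 # study months
post = (months >= 18).astype(int)      # CDS implemented at month 18 (placeholder)
time_after = np.where(post == 1, months - 18, 0)
# Simulated counts with a ~1.4%/month slope change after implementation.
lam = np.exp(5.0 + 0.002 * months - 0.014 * time_after)
cultures = rng.poisson(lam)

df = pd.DataFrame({"cultures": cultures, "month": months,
                   "post": post, "time_after": time_after})
# Segmented Poisson model: level change (post) and slope change (time_after).
fit = smf.poisson("cultures ~ month + post + time_after", data=df).fit(disp=False)
# exp(slope-change coefficient) = monthly rate ratio after implementation.
print(np.exp(fit.params["time_after"]))
```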
Results:
The CDS tool was prompted for 38,361 urine-culture orders across all patients, including those for 2,133 catheterized patients during the postintervention study period. There was a significant decrease in urine-culture orders (1.4% decrease per month; P < .001) and in antibiotic use for urinary tract infection (UTI) indications (2.3% decrease per month; P = .006), but there was no significant decline in rates of catheter-associated UTI (CAUTI) in the postintervention period. Clinicians opted for urinary catheter removal in 183 (8.5%) instances. Evaluation of the safety reporting system revealed no apparent increase in safety events related to catheter removal or reinsertion.
Conclusion:
CDS tools can aid in optimizing urine culture collection practices and can serve as a reminder for removal or exchange of long-term indwelling urinary catheters at the time of urine-culture collection.
Archaeologists tend to produce slow data that is contextually rich but often difficult to generalize. An example is the analysis of lithic microdebitage, or knapping debris, that is smaller than 6.3 mm (0.25 in.). So far, scholars have relied on manual approaches that are prone to intra- and interobserver errors. In the following, we present a machine learning–based alternative combined with experimental archaeology and dynamic image analysis. We use a dynamic image particle analyzer to measure each particle in experimentally produced lithic microdebitage (N = 5,299) as well as in an archaeological soil sample (N = 73,313). We developed four machine learning models based on Naïve Bayes, glmnet (generalized linear regression), random forest, and XGBoost (“Extreme Gradient Boost[ing]”) algorithms. Hyperparameter tuning optimized each model. A random forest model performed best, with a sensitivity of 83.5%; it misclassified only 28 particles, or 0.9%, of lithic microdebitage. XGBoost models reached a sensitivity of 67.3%, whereas Naïve Bayes and glmnet models stayed below 50%. Except for the glmnet models, transparency proved to be the most critical variable for distinguishing microdebitage. Our approach objectifies and standardizes microdebitage analysis, and machine learning allows the study of much larger sample sizes. Algorithms differ, though, and a random forest model offers the best performance so far.
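As a hypothetical sketch of the winning model class, the Python snippet below trains a random forest on simulated particle-analyzer features and reports sensitivity, the paper's headline metric. The feature names and data are illustrative stand-ins, not the authors' dataset.

```python
# Hypothetical random-forest sketch for separating lithic microdebitage
# from soil particles using particle-analyzer measurements.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score

rng = np.random.default_rng(4)
n = 2000
# Columns: transparency, sphericity, aspect ratio (illustrative analyzer outputs).
X = rng.normal(size=(n, 3))
y = rng.integers(0, 2, size=n)  # 1 = microdebitage, 0 = soil particle
X[y == 1, 0] += 1.0             # transparency was the most discriminating variable

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
# Sensitivity = recall for the microdebitage class.
print(recall_score(y_te, clf.predict(X_te)))
```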
Neonates and infants who undergo congenital cardiac surgery frequently have difficulty with feeding. The factors that predispose these patients to require a gastrostomy tube have not been well defined. We aimed to report the incidence and describe hospital outcomes and characteristics in neonates and infants undergoing congenital cardiac surgery who required gastrostomy tube placement.
Materials and methods:
A retrospective review was performed on patients undergoing congenital cardiac surgery between October 2015 and December 2020. Patients were identified by International Classification of Diseases 10th Revision codes, utilising the performance improvement database Vizient® Clinical Data Base, and stratified by age at admission: neonates (<1 month) and infants (1–12 months). Outcomes were compared and comparative analysis performed between admissions with and without gastrostomy tube placement.
Results:
There were 11,793 admissions: 3,519 (29.8%) neonates and 8,274 (70.2%) infants. We found an increased incidence of gastrostomy tube placement in neonates as compared with infants following congenital cardiac surgery (23.1% versus 6%, p < 0.001). Outcomes in neonates and infants were similar, with increased length of stay and cost in those requiring a gastrostomy tube. Gastrostomy tube placement was more likely in neonates and infants with upper airway anomalies, congenital abnormalities, hospital infections, and genetic abnormalities.
Discussion:
Age at hospitalisation for congenital cardiac surgery is a definable risk factor for gastrostomy tube requirement. Additional factors contribute to gastrostomy tube placement and should be used when counselling families regarding the potential requirement of a gastrostomy tube.
This paper presents a deep-learning-based workflow to detect synapses and predict their neurotransmitter type in electron microscopy (EM) images of the primitive chordate Ciona intestinalis (Ciona). Identifying synapses from EM images to build a full map of connections between neurons is a labor-intensive process that requires significant domain expertise. Automation of synapse classification would hasten the generation and analysis of connectomes. Furthermore, inferences concerning neuron type and function from synapse features are in many cases difficult to make. Finding the connection between synapse structure and function is an important step toward fully understanding a connectome. Class Activation Maps derived from the convolutional neural network provide insights into important features of synapses based on cell type and function. The main contribution of this work is the differentiation of synapses by neurotransmitter type through the structural information in their EM images. This enables the prediction of neurotransmitter types for neurons in Ciona that were previously unknown. The prediction model and code are available on GitHub.
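To indicate what a Class Activation Map computation looks like in practice, here is a minimal, self-contained PyTorch sketch on a toy network. It follows the standard global-average-pooling CAM formulation (Zhou et al., 2016) and is not the paper's published model, which is available on GitHub.

```python
# Minimal Class Activation Map (CAM) sketch on a toy CNN; architecture and
# input are stand-ins, not the paper's network or EM data.
import torch
import torch.nn as nn

class TinyCAMNet(nn.Module):
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.pool = nn.AdaptiveAvgPool2d(1)  # global average pooling
        self.fc = nn.Linear(32, n_classes)

    def forward(self, x):
        fmap = self.features(x)              # (B, 32, H, W)
        logits = self.fc(self.pool(fmap).flatten(1))
        return logits, fmap

model = TinyCAMNet().eval()
em_patch = torch.randn(1, 1, 64, 64)         # stand-in for an EM image patch
with torch.no_grad():
    logits, fmap = model(em_patch)
cls = logits.argmax(1).item()
# CAM: the predicted class's fc weights applied to the final feature maps.
cam = torch.einsum("c,chw->hw", model.fc.weight[cls], fmap[0])
print(cam.shape)  # spatial map highlighting regions driving the prediction
```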
To describe pediatric outpatient visits and antibiotic prescribing during the coronavirus disease 2019 (COVID-19) pandemic.
Design:
An observational, retrospective control study from January 2019 to October 2021.
Setting:
Outpatient clinics, including 27 family medicine clinics, 27 pediatric clinics, and 26 urgent or prompt care clinics.
Patients:
Children aged 0–19 years receiving care in an outpatient setting.
Methods:
Data were extracted from the electronic health record. The COVID-19 era was defined as April 1, 2020, to October 31, 2021. Virtual visits were identified by coded encounter or visit-type variables. Visit diagnoses were assigned using a 3-tier classification system based on the appropriateness of antibiotic prescribing, and a subanalysis of respiratory visits was performed to compare the COVID-19 era with the baseline period.
Results:
Through October 2021, we detected an overall sustained reduction of 18.2% in antibiotic prescribing to children. Disproportionate changes occurred in the percentage of pediatric respiratory visits resulting in an antibiotic prescription, by age, race or ethnicity, practice setting, and prescriber type. Virtual visits were minimal during the study period but did not result in higher rates of antibiotic visits or in-person follow-up visits.
Conclusions:
These findings suggest that reductions in antibiotic prescribing have been sustained despite increases in outpatient visits. However, additional studies are warranted to better understand disproportionate rates of antibiotic visits.
Over the last 25 years, radiowave detection of neutrino-generated signals, using cold polar ice as the neutrino target, has emerged as perhaps the most promising technique for detection of extragalactic ultra-high energy neutrinos (corresponding to neutrino energies in excess of 0.01 joules, or $10^{17}$ electron volts). During the summer of 2021, and in tandem with the initial deployment of the Radio Neutrino Observatory in Greenland (RNO-G), we conducted radioglaciological measurements at Summit Station, Greenland, to refine our understanding of the ice target. We report the result of one such measurement, the radio-frequency electric field attenuation length $L_\alpha$. We find an approximately linear dependence of $L_\alpha$ on frequency, with the best fit of the average field attenuation for the upper 1500 m of ice given by $\langle L_\alpha \rangle = ( ( 1154 \pm 121) - ( 0.81 \pm 0.14) \, ( \nu /{\rm MHz}) ) \,{\rm m}$ for frequencies $\nu \in [145, 350]$ MHz.
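For convenience, the fitted attenuation model can be evaluated directly. The Python snippet below computes $\langle L_\alpha \rangle$ at a given frequency with first-order error propagation, treating the quoted parameter uncertainties as uncorrelated (the fit covariance is not given in the abstract, so this is an approximation).

```python
# Evaluate the published attenuation-length fit, with uncorrelated error propagation.
import numpy as np

def attenuation_length_m(freq_mhz: float) -> tuple[float, float]:
    a, sig_a = 1154.0, 121.0  # intercept (m) and its uncertainty
    b, sig_b = 0.81, 0.14     # slope (m per MHz) and its uncertainty
    L = a - b * freq_mhz
    sigma = np.sqrt(sig_a**2 + (freq_mhz * sig_b) ** 2)  # correlations neglected
    return L, sigma

L, s = attenuation_length_m(250.0)  # a mid-band frequency
print(f"<L_alpha>(250 MHz) = {L:.0f} ± {s:.0f} m")  # ≈ 952 ± 126 m
```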
Field studies were conducted in North Carolina in 2018 and 2019 to determine sweetpotato tolerance to indaziflam and its effectiveness in controlling Palmer amaranth in sweetpotato. Treatments included indaziflam applied pretransplant, 7 d after transplanting (DATr), or 14 DATr at 29, 44, 58, or 73 g ai ha⁻¹, plus weedy and weed-free checks. Indaziflam applied postemergence caused transient foliar injury to sweetpotato. Indaziflam applied pretransplant caused less injury to sweetpotato than other application timings, regardless of rate. Palmer amaranth control was greatest when indaziflam was applied pretransplant or 7 DATr. In a weed-free environment, sweetpotato marketable yield decreased as indaziflam application was delayed. No differences in storage root length-to-width ratio were observed.
Monoclonal antibody therapeutics to treat coronavirus disease (COVID-19) have been authorized by the US Food and Drug Administration under Emergency Use Authorization (EUA). Many barriers exist when deploying a novel therapeutic during an ongoing pandemic, and it is critical to assess the needs of incorporating monoclonal antibody infusions into pandemic response activities. We examined the monoclonal antibody infusion site process during the COVID-19 pandemic and conducted a descriptive analysis using data from 3 sites at medical centers in the United States supported by the National Disaster Medical System. Monoclonal antibody implementation success factors included engagement with local medical providers, therapy batch preparation, placing the infusion center in proximity to emergency services, and creating procedures resilient to EUA changes. Infusion process challenges included confirming patient severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) positivity, strained staff, scheduling, and pharmacy coordination. Infusion sites are effective when integrated into pre-existing pandemic response ecosystems and can be implemented with limited staff and physical resources.