To evaluate the impact of COVID-19 prevention training with video-based feedback on nursing home (NH) staff safety behaviors.
Design:
Public health intervention
Setting & Participants:
Twelve NHs in Orange County, California, June 2020–April 2022
Methods:
NHs received direct-to-staff COVID-19 prevention training and weekly feedback reports with video montages about hand hygiene, mask-wearing, and mask/face-touching. One-hour periods of recorded streaming video from common areas (breakroom, hallway, nursing station, entryway) were sampled randomly across days of the week and nursing shifts and audited for safe behavior. Multivariable models assessed the intervention's impact.
Results:
Video auditing encompassed 182,803 staff opportunities for safe behavior. Hand hygiene errors improved from the first (67.0%) to the last (35.7%) month of the intervention, decreasing 7.6% per month (OR = 0.92, 95% CI = 0.92–0.93, P < 0.001); masking errors improved from the first (10.3%) to the last (6.6%) month, decreasing 2.3% per month (OR = 0.98, 95% CI = 0.97–0.99, P < 0.001); face/mask touching improved from the first (30.0%) to the last (10.6%) month, decreasing 2.5% per month (OR = 0.98, 95% CI = 0.97–0.98, P < 0.001). Hand hygiene errors were most common in entryways and on weekends, with similar rates across shifts. Masking and face/mask-touching errors were most common in breakrooms, with the latter occurring most commonly during the day (7 A.M.–3 P.M.) shift and at similar rates across weekdays/weekends. Error reductions were seen across camera locations, days of the week, and nursing shifts, suggesting a widespread benefit within participating NHs.
Conclusion:
Direct-to-staff training with video-based feedback was temporally associated with improved hand hygiene, masking, and face/mask-touching behaviors among NH staff during the COVID-19 pandemic.
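As a point of arithmetic for the odds ratios above: a per-month OR maps to an approximate percent change in the odds of an error of (1 − OR) × 100, so the reported 7.6% monthly decrease is consistent with an unrounded OR near 0.924, while the rounded OR of 0.92 would give roughly 8%. A minimal sketch of the conversion:

```python
def monthly_percent_decrease(odds_ratio: float) -> float:
    """Approximate percent decrease in error odds per month, given a monthly OR."""
    return (1.0 - odds_ratio) * 100.0

# A rounded OR of 0.92 corresponds to roughly 8% lower odds each month.
print(round(monthly_percent_decrease(0.92), 1))  # prints 8.0
```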
Herbaceous perennials must annually rebuild the aboveground photosynthetic architecture from carbohydrates stored in crowns, rhizomes, and roots. Knowledge of carbohydrate utilization and storage can inform management decisions and improve control outcomes for invasive perennials. We monitored the nonstructural carbohydrates in a population of the hybrid Bohemian knotweed [Polygonum ×bohemicum (J. Chrtek & Chrtková) Zika & Jacobson [cuspidatum × sachalinense]; syn.: Fallopia ×bohemica (Chrtek and Chrtková) J.P. Bailey] and in Japanese knotweed [Polygonum cuspidatum Siebold & Zucc.; syn.: Fallopia japonica (Houtt.) Ronse Decr.]. Carbohydrate storage in crowns followed seasonal patterns typical of perennial herbaceous dicots corresponding to key phenological events. Starch was consistently the highest nonstructural carbohydrate present. Sucrose levels did not show a consistent inverse relationship with starch levels. Lateral distribution of starch in rhizomes and, more broadly, total nonstructural carbohydrates sampled before dormancy break showed higher levels in rhizomes compared with crowns. Total nonstructural carbohydrate levels in crowns reached seasonal lows at an estimated 22.6% of crown dry weight after accumulating 1,453.8 growing degree days (GDD) by the end of June, mainly due to depleted levels of stored starch, with the estimated minimum of 12.3% reached by 1,220.3 GDD accumulated by mid-June. Depletion corresponded to rapid development of vegetative canopy before entering the reproductive phase in August. Maximum starch accumulation in crowns followed complete senescence of aboveground tissues by mid- to late October. Removal of aboveground shoot biomass in late June to early July with removal of vegetation regrowth in early September before senescence would optimize the use of time and labor to deplete carbohydrate reserves. 
Additionally, foliar-applied systemic herbicide translocation to belowground tissue should be maximized with applications in late August through early fall to optimize downward translocation with assimilate movement to rebuild underground storage reserves. Fall applications should be made before loss of healthy leaf tissue, with the window for control typically ending by late September in Minnesota.
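The growing-degree-day (GDD) thresholds cited above are typically accumulated with the simple averaging method; a minimal sketch (the base temperature and example values here are illustrative assumptions, not parameters from the study):

```python
def daily_gdd(t_max: float, t_min: float, t_base: float = 10.0) -> float:
    """Growing degree days for one day via the averaging method (floored at zero)."""
    return max(0.0, (t_max + t_min) / 2.0 - t_base)

def accumulated_gdd(daily_temps, t_base: float = 10.0) -> float:
    """Sum daily GDD over a sequence of (t_max, t_min) pairs."""
    return sum(daily_gdd(hi, lo, t_base) for hi, lo in daily_temps)

# Three illustrative warm days: daily GDD of 10, 12, and 7.
print(accumulated_gdd([(25, 15), (28, 16), (22, 12)]))  # prints 29.0
```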
We provide an assessment of the Infinity Two Fusion Pilot Plant (FPP) baseline plasma physics design. Infinity Two is a four-field period, aspect ratio A = 10, quasi-isodynamic stellarator with improved confinement appealing to a max-J approach, elevated plasma density, and high magnetic fields (⟨B⟩ = 9 T). At the envisioned operating point [800 MW deuterium-tritium (DT) fusion], the configuration has robust magnetic surfaces based on magnetohydrodynamic (MHD) equilibrium calculations and is stable to both local and global MHD instabilities. The configuration has excellent confinement properties with small neoclassical transport and low bootstrap current (|I_bootstrap| ∼ 2 kA). Calculations of collisional alpha particle confinement in a DT FPP scenario show small energy losses to the first wall (< 1.5%) and stable energetic particle/Alfvén eigenmodes at high ion density. Low turbulent transport is produced using a combination of density profile control consistent with pellet fueling and reduced stiffness to turbulent transport via three-dimensional shaping. Transport simulations with the T3D-GX-SFINCS code suite with self-consistent turbulent and neoclassical transport predict that the P_fus = 800 MW operating point is attainable with high fusion gain (Q = 40) at volume-averaged electron densities n_e ≈ 2 × 10²⁰ m⁻³, below the Sudo density limit. Additional transport calculations show that an ignited (Q = ∞) solution is available at slightly higher density (2.2 × 10²⁰ m⁻³) with P_fus = 1.5 GW. The magnetic configuration is defined by a magnetic coil set with sufficient room for an island divertor, shielding, and blanket solutions with tritium breeding ratios (TBR) above unity. An optimistic estimate for the gas-cooled solid breeder design, the Helium Cooled Pebble Bed, is TBR ∼ 1.3. Infinity Two satisfies the physics requirements of a stellarator fusion pilot plant.
The magnetohydrodynamic equilibrium and stability properties of the Infinity Two Fusion Pilot Plant baseline plasma physics design are presented. The configuration is a four-field period, aspect ratio A = 10, quasi-isodynamic stellarator optimized for excellent confinement at elevated density and high magnetic field (B = 9 T). Magnetic surfaces exist in the plasma core in vacuum and retain good equilibrium surface integrity from vacuum to an operational β = 1.6%, the ratio of the volume-averaged plasma pressure to the magnetic pressure, corresponding to 800 MW deuterium-tritium fusion operation. Neoclassical calculations show that a self-consistent bootstrap current on the order of ∼ 1 kA slightly increases the rotational transform profile, by less than 0.001. The configuration has a magnetic well across its entire radius. From vacuum to the operating point, the configuration exhibits good ballooning stability characteristics, exhibits good Mercier stability across most of its minor radius, and is stable against global low-n MHD instabilities up to β = 3.2%.
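For reference, the plasma β quoted above is the standard ratio of volume-averaged plasma pressure to magnetic pressure:

```latex
\beta \;=\; \frac{\langle p \rangle}{B^2 / 2\mu_0}
```

where ⟨p⟩ is the volume-averaged plasma pressure, B the magnetic field strength, and μ₀ the vacuum permeability.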
The selection, design, and optimization of a suitable blanket configuration for an advanced high-field stellarator concept is seen as a key feasibility issue and has been incorporated as a vital and necessary part of the Infinity Two Fusion Pilot Plant (FPP) physics basis. The focus of this work was to identify a baseline blanket which can be rapidly deployed for Infinity Two while also maintaining flexibility and opportunities for higher-performing concepts later in development. Results from this analysis indicate that gas-cooled solid breeder designs such as the Helium Cooled Pebble Bed (HCPB) are the most promising concepts, primarily motivated by the neutronics performance at applicable blanket build depths and the relatively mature technology basis. The lithium lead (PbLi) family of concepts, particularly the Dual Cooled Lithium Lead (DCLL), offers a compelling alternative to solid blanket concepts, as these designs have synergistic developmental pathways while simultaneously mitigating much of the technical risk of solid blankets. Homogenized three-dimensional neutronics analysis of the Infinity Two configuration indicates that the HCPB achieves an adequate tritium breeding ratio (TBR) (1.30, which provides sufficient margin at low engineering fidelity) and nearly adequate shielding of the magnets (average fast fluence of 1.3 × 10¹⁸ n/cm² per full-power year). The thermal analysis indicates that reasonably high thermal efficiencies (greater than 30%) are readily achievable with the HCPB paired with a simple Rankine cycle using reheat. Finally, the tritium fuel cycle analysis for Infinity Two shows viability, with anticipated operational inventories of less than one kilogram (approximately 675 grams) and a required TBR (TBR_req) of less than 1.05 to maintain fuel self-sufficiency (approximately 1.023 for a driver blanket with no inventory doubling).
Although further optimization and engineering design is still required, at the physics basis stage all initial targets have been met for the Infinity Two configuration.
Accountable care models for Medicaid reimbursement aim to improve care quality and reduce costs by linking payments to performance. Oregon’s coordinated care organizations (CCOs) assume financial responsibility for their members and are incentivized to help clinics improve performance on specific quality metrics. This study explores how Oregon’s CCO model influences partnerships between payers and primary care clinics, focusing on strategies used to enhance screening and treatment for unhealthy alcohol use (UAU).
Methods:
In this qualitative study, we conducted semi-structured interviews with informants from 12 of 13 Oregon CCOs active in 2019 and 2020. The interviews focused on payer–provider partnerships, specifically around UAU screening and treatment, which is a long-standing CCO metric. We used thematic analysis to identify key themes and causal-loop diagramming to uncover feedback dynamics and communicate key findings. Meadows’ leverage point framework was applied to categorize findings based on their potential to drive change.
Results:
CCO strategies to support clinics included building relationships, reporting on metric progress, providing electronic health record (EHR) technical assistance, offering training, and implementing alternative payment methods. CCOs prioritized clinics with more members and those that were highly motivated. Our analysis showed that while the CCO model aligned goals between payers and clinics, it may perpetuate rural disparities by prioritizing larger, better-resourced clinics.
Conclusions:
Oregon’s CCO model fosters partnerships centered on quality metrics but may unintentionally reinforce rural disparities by incentivizing support for larger clinics. Applying the Meadows framework highlighted leverage points within these partnerships.
Primary hyperparathyroidism (PHPT) is the presence of hypercalcaemia with an elevated or inappropriately normal parathyroid hormone level. In clinical psychiatry this is often detected on routine blood investigations. This article aims to help mental health professionals understand the relevance of PHPT to psychiatry and offers some guidance about further management of patients presenting with this endocrine abnormality in mental health settings. PHPT can be associated with both mental and physical health problems in some individuals, making it a crucial diagnosis that should not be overlooked.
Since COVID-19, mental healthcare telehealth services have increased. A 2021 online survey (TeleSCOPE 1.0 [T1]) identified challenges in evaluating, diagnosing, and monitoring drug-induced movement disorders (DIMDs) with telehealth (via video or phone). TeleSCOPE 2.0 (T2) was conducted to understand the impact of telehealth after COVID-19 restrictions were lifted.
Methods
T2 was fielded (May 18–June 9, 2023) to neurologists (neuros), psychiatrists (psychs), and nurse practitioners (NPs)/physician assistants (PAs) affiliated with neurology/psychiatry practices who had prescribed vesicular monoamine transporter 2 inhibitors or benztropine for DIMD in the past 6 months and saw ≥15% of patients via telehealth at peak and post-COVID.
Results
100 neuros, 100 psychs, and 105 NP/PAs responded. More patients were seen in person post-COVID (12-27% vs 31-53%), but the percentage seen by video remained largely unchanged (54-62% vs 37-53%). Issues influencing appointment setting in T2 remained access to care, technology, and digital literacy, although T2 clinicians reported that fewer patients had issues connecting for a video visit. In T2, clinicians used multiple telehealth methods to evaluate DIMDs, including personal phone videos (48-66%), telemedicine apps (36-45%), health/fitness trackers (6-13%), and other (2-5%). Common T2 diagnostic telehealth issues included determining signs of difficulty with gait, falls, walking, and standing; difficulty writing or using a phone or computer; and painful movements. Among patients evaluated for DIMD, more received an eventual diagnosis in T2 vs T1 both in person (34-53% vs 26-46%) and by video (32-51% vs 29-44%), but, on average, neuros and psychs required 1 more telehealth visit to confirm a DIMD diagnosis vs in person. Over half of clinicians on average recommended patients come in person to confirm a DIMD diagnosis. Most clinicians reported ongoing difficulty diagnosing patients via phone. Neuros were less comfortable than psychs/NP/PAs with telehealth visits due to the risk of misdiagnosis and liability. While all clinicians saw telehealth advantages, neuros expect to see more of their patients in person post-COVID. However, in T2, the number of clinicians who found it difficult to manage DIMD cases by video had significantly decreased (T1 52-54%; T2 28-36%). Half of clinicians reported the absence of a caregiver as a significant barrier to diagnosis and treatment via telehealth. Clear guidelines and provider education were the most feasible strategies to implement to improve telehealth quality of care.
Conclusions
T2 clinicians are more comfortable managing DIMDs via telehealth but require ~1 more visit to confirm a diagnosis vs in person. Significant barriers to telehealth remain, including digital literacy, inconsistent caregiver presence, and lack of clear diagnostic guidelines. Clinicians see value in telehealth, but it is still not as effective as in-person care. Significantly more clinicians are in-office post-COVID, and >50% recommend patients come in person to confirm a DIMD diagnosis.
This study aimed to explore the genetic variability present in tamarind fruits. A survey and collection of twenty-nine tamarind accessions from the Bastar region of Chhattisgarh was conducted, focusing on morphological traits, biochemical properties, and mineral content. The analysis revealed significant variation in fruit characteristics, including pod weight (91.1–528.3 g), pod length (4.11–15.39 cm), pulp weight (32.88–275.68 g), number of seeds (26–237), seed weight (23.14–214.08 g), pulp percentage (26.43–52.18%), vitamin C content (54.5–92 mg/100 g), phenolic content (51.53–296.4 mg GAE/g fw), flavonoid content (75.91–280.88 mg QE/100 g fw), acidity (5.3–12.60%), reducing sugars (24.67–68.29%), total sugars (24.89–78.87%), calcium (0.15–1.28%), and iron content (26.6–125.7 ppm) across different accessions. Based on the overall evaluation, five accessions (B21, B26, B15, B25, and B7) with the best combination of desirable fruit traits were identified as the most promising. Additionally, five sweet accessions with acidity levels below 6% were identified (B26, B21, B15, B12, B11). Principal component analysis (PCA) was applied, identifying five principal components that accounted for 86.73% of the total variability. Correlation analysis showed a significant positive relationship between pod weight and pulp weight (r = 0.93), shell weight (r = 0.70), number of seeds (r = 0.89), and seed weight (r = 0.89). The biplot of PC1 and PC2 illustrated the distribution of accessions across all four quadrants, with B27, B8, B26, B29, B14, B18, and B13 displaying distinct differences from one another.
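The PCA step above, retaining the components that explain most of the trait variability, can be sketched with a singular value decomposition of the standardized trait matrix (the matrix below is a random toy example, not the study's accession data):

```python
import numpy as np

def pca_explained_variance(X: np.ndarray, n_components: int) -> float:
    """Fraction of total variance captured by the first n principal components."""
    Xc = X - X.mean(axis=0)               # center each trait (column)
    Xs = Xc / Xc.std(axis=0, ddof=1)      # standardize: traits use different units
    _, s, _ = np.linalg.svd(Xs, full_matrices=False)
    var = s ** 2                          # variance along each principal axis
    return float(var[:n_components].sum() / var.sum())

# Toy matrix: 6 accessions x 4 traits (illustrative values, not the study data).
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))
frac = pca_explained_variance(X, 2)
print(0.0 < frac < 1.0)  # prints True
```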
Clinical trials often struggle to recruit enough participants, with only 10% of eligible patients enrolling. This is concerning for conditions like stroke, where timely decision-making is crucial. Frontline clinicians typically screen patients manually, but this approach can be overwhelming and lead to many eligible patients being overlooked.
Methods:
To address the problem of efficient and inclusive screening for trials, we developed a matching algorithm using imaging and clinical variables gathered as part of the AcT trial (NCT03889249) to automatically screen patients by matching these variables with the trials’ inclusion and exclusion criteria using rule-based logic. We then used the algorithm to identify patients who could have been enrolled in six trials: EASI-TOC (NCT04261478), CATIS-ICAD (NCT04142125), CONVINCE (NCT02898610), TEMPO-2 (NCT02398656), ESCAPE-MEVO (NCT05151172), and ENDOLOW (NCT04167527). To evaluate our algorithm, we compared our findings to the number of enrollments achieved without using a matching algorithm. The algorithm’s performance was validated by comparing its results with ground truth from a manual review by two clinicians. Its ability to reduce screening time was assessed by comparing it with the average time taken by study clinicians.
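Rule-based eligibility matching of the kind described above can be sketched as predicate checks over a patient's variables; the criteria and field names below are purely illustrative, not the actual criteria of any cited trial:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Criterion:
    name: str
    check: Callable[[dict], bool]   # True if the patient satisfies the rule

def screen(patient: dict, inclusion: List[Criterion],
           exclusion: List[Criterion]) -> bool:
    """Candidate if every inclusion rule passes and no exclusion rule fires."""
    return all(c.check(patient) for c in inclusion) and not any(
        c.check(patient) for c in exclusion
    )

# Illustrative rules only -- not the real criteria of any trial named above.
inclusion = [
    Criterion("age >= 18", lambda p: p["age"] >= 18),
    Criterion("NIHSS > 0", lambda p: p["nihss"] > 0),
]
exclusion = [Criterion("prior ICH", lambda p: p["prior_ich"])]

patient = {"age": 72, "nihss": 6, "prior_ich": False}
print(screen(patient, inclusion, exclusion))  # prints True
```

In practice each rule would be derived from a trial's registered inclusion/exclusion list, and patients passing `screen` would be flagged for clinician review rather than enrolled automatically.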
Results:
The algorithm identified more potentially eligible study candidates than the number of participants actually enrolled. It also showed over 90% sensitivity and specificity for all trials and reduced screening time by over 100-fold.
Conclusions:
Automated matching algorithms can help clinicians quickly identify eligible patients and reduce the resources needed for enrollment. Additionally, the algorithm can be modified for use in other trials and diseases.
Direct numerical simulations of the turbulence of a Herschel–Bulkley (HB) fluid in a rough channel are performed at a shear Reynolds number $Re_{\tau } \approx 300$ and a Bingham number ${Bn} \approx 0.9$. For the type of rough surface used in this study, the results indicate that Townsend's wall similarity hypothesis also holds for HB fluids. However, there are notable differences compared with the effect of roughness on Newtonian fluids. More specifically, the effect of roughness appears to be slightly stronger for HB fluids, in the sense that the bulk Reynolds number, based on the viscosity at the wall, is reduced further due to the increase in viscosity in the troughs of the roughness surface induced by the low shear. At the same time, for the simulated rough surface, the contribution of form drag to the total pressure drop is reduced from 1/4 to about 1/5 due to the persistence of viscous shear in the boundary layer, reducing its shielding effect. As for the friction factor, due to the nonlinearity of the HB constitutive relation, its use with the wall shear rate from the mean wall shear stress underpredicts the minimum viscosity at the wall by up to 18 %. This inevitably leads to uncertainties in the prediction of the friction factor. Finally, it is observed that the rough surface is unable to break the peculiar near-wall flow structure of HB fluids, which consists of long persistent low-speed streaks occupying the entire domain. This means that the small-scale energy is significantly reduced for HB fluids, even in rough channels, with the energy more concentrated in the lower wavenumber range, implying an increase in the slope of the power spectrum to $-7/2$ in the inertial range, as shown by Mitishita et al. (J. Non-Newtonian Fluid Mech., vol. 293, 2021, 104570).
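For reference, the Herschel–Bulkley model referred to above combines a yield stress τ_y with power-law behavior of consistency K and flow index n:

```latex
\tau = \tau_y + K \dot{\gamma}^{\,n} \quad (\tau > \tau_y),
\qquad
\mu_{\mathrm{eff}}(\dot{\gamma}) = \frac{\tau_y}{\dot{\gamma}} + K \dot{\gamma}^{\,n-1}
```

The growth of the effective viscosity μ_eff as the shear rate γ̇ becomes small is the nonlinearity behind the abstract's observations: viscosity rises in the low-shear troughs of the rough surface, and evaluating viscosity at the mean wall shear stress misrepresents the true minimum wall viscosity.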
Herbicides have been placed in global Herbicide Resistance Action Committee (HRAC) herbicide groups based on their sites of action (e.g., acetolactate synthase–inhibiting herbicides are grouped in HRAC Group 2). A major driving force for this classification system is that growers have been encouraged to rotate or mix herbicides from different HRAC groups to delay the evolution of herbicide-resistant weeds, because in theory, all active ingredients within a herbicide group physiologically affect weeds similarly. Although herbicide resistance in weeds has been studied for decades, recent research on the biochemical and molecular basis for resistance has demonstrated that patterns of cross-resistance are usually quite complicated and much more complex than merely stating, for example, that a certain weed population is Group 2-resistant. The objective of this review article is to highlight and describe the intricacies associated with the magnitude of herbicide resistance and cross-resistance patterns that have resulted from myriad target-site and non–target site resistance mechanisms in weeds, as well as environmental and application timing influences. Our hope is that this review will provide opportunities for students, growers, agronomists, ag retailers, regulatory personnel, and research scientists to better understand and realize that herbicide resistance in weeds is far more complicated than previously considered when based solely on HRAC groups. Furthermore, a comprehensive understanding of cross-resistance patterns among weed species and populations may assist in managing herbicide-resistant biotypes in the short term by providing growers with previously unconsidered effective control options. This knowledge may also inform agrochemical company efforts aimed at developing new resistance-breaking chemistries and herbicide mixtures.
However, in the long term, nonchemical management strategies, including cultural, mechanical, and biological weed management tactics, must also be implemented to prevent or delay increasingly problematic issues with weed resistance to current and future herbicides.
Inadequate response to first- and second-line pharmacological treatments for psychiatric disorders is commonly observed. Ketamine has demonstrated efficacy in treating adults with treatment-resistant depression (TRD), with additional off-label benefits reported for various psychiatric disorders. Herein, we performed a systematic review and meta-analysis to examine the therapeutic applications of ketamine across multiple mental disorders, excluding mood disorders.
Methods
We conducted a multidatabase literature search of randomized controlled trials and open-label trials investigating the therapeutic use of ketamine in treating mental disorders. Studies utilizing the same psychological assessments for a given disorder were pooled using the generic inverse variance method to generate a pooled estimate of the mean difference.
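The generic inverse-variance method weights each study's mean difference by the inverse of its variance; a minimal fixed-effect sketch (the study values below are illustrative, not data from the included trials):

```python
from math import sqrt

def inverse_variance_pool(effects, std_errors):
    """Fixed-effect pooled mean difference and its standard error."""
    weights = [1.0 / se ** 2 for se in std_errors]      # w_i = 1 / SE_i^2
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Toy example: three studies reporting mean differences with standard errors.
est, se = inverse_variance_pool([-30.0, -25.0, -20.0], [5.0, 4.0, 6.0])
print(round(est, 2))  # pooled estimate, weighted toward the most precise study
```

A random-effects variant (as often used when between-study heterogeneity is present) would additionally inflate each study's variance by an estimated between-study component before weighting.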
Results
The search in OVID (MEDLINE, Embase, AMED, PsycINFO, JBI EBP Database), EBSCO CINAHL Plus, Scopus, and Web of Science yielded 44 studies. Ketamine had a statistically significant effect on PTSD Checklist for DSM-5 (PCL-5) scores (pooled estimate = −28.07, 95% CI = [−40.05, −16.11], p < 0.001), Clinician-Administered PTSD Scale for DSM-5 (CAPS-5) scores (pooled estimate = −14.07, 95% CI = [−26.24, −1.90], p = 0.023), and Yale-Brown Obsessive Compulsive Scale (Y-BOCS) scores (pooled estimate = −8.08, 95% CI = [−13.64, −2.52], p = 0.004) in individuals with PTSD, treatment-resistant PTSD (TR-PTSD), and obsessive compulsive disorder (OCD), respectively. For alcohol use disorders and at-risk drinking, there was disproportionate reporting of decreased urge to drink, increased rate of abstinence, and longer time to relapse following ketamine treatment.
Conclusions
Extant literature supports the potential use of ketamine for the treatment of PTSD, OCD, and alcohol use disorders with significant improvement of patient symptoms. However, the limited number of randomized controlled trials underscores the need to further investigate the short- and long-term benefits and risks of ketamine for the treatment of psychiatric disorders.
Drip irrigation and mulching were tested to minimize unproductive water loss through evaporation and weed interference. A field experiment was conducted during the spring seasons of 2020 and 2021 in a split-plot design with three replications. The study included six treatment combinations of drip irrigation methods (surface drip and subsurface drip irrigation) and mulching (black plastic, paddy straw, and no mulch), along with conventional furrow irrigation without mulching (as control), in the main plots. Four weed control treatments (atrazine 1000 g a.i./ha as pre-emergence, two hand weedings at 30 and 60 days after sowing [DAS], weed-free, and weedy for the whole crop growth period) were kept in the subplots. The combination of drip irrigation and mulches significantly enhanced leaf area index and crop biomass at 60 DAS compared with furrow irrigation. Integration of subsurface drip irrigation with plastic mulching resulted in the lowest weed density and biomass among the main plots. Drip irrigation coupled with plastic and straw mulching resulted in 86% and 50% reductions in weed density and biomass, respectively, as compared to no mulching. Integration of subsurface drip with paddy straw mulch and black plastic mulch resulted in 17.1% and 15.5% higher maize grain yield, respectively, as compared to furrow irrigation. The highest irrigation water productivity (3.58 kg/m3) was observed with the combination of subsurface drip and paddy straw mulch, followed by the combination of subsurface drip and black plastic mulch (3.51 kg/m3). Overall, straw mulching in a drip irrigation system proved economical in terms of maize productivity.
Volunteer corn (Zea mays L.) is a competitive weed in corn-based cropping systems. Scientific literature does not exist on the water use of volunteer corn grown in different crops and irrigation systems. The objectives of this study were to characterize the growth and evapotranspiration (ETa) of volunteer corn in corn, soybean [Glycine max (L.) Merr.], and sorghum [Sorghum bicolor (L.) Moench] under center-pivot irrigation (CPI) and subsurface drip irrigation (SDI) systems. Field experiments were conducted in south-central Nebraska in 2021 and 2022. Soil moisture sensors were installed at depths of 0 to 0.30, 0.30 to 0.60, and 0.60 to 0.90 m to track the soil water balance and quantify seasonal total ETa. Corn was the most competitive crop, as volunteer corn in corn had the lowest biomass, leaf area, and plant height compared with the fallow. Soybean was the least competitive with volunteer corn, as the plant height, biomass, and leaf area of volunteer corn in soybean were similar to fallow at 15, 30, 45, and 60 d after transplanting (DATr). Averaged across crop treatments, irrigation type did not affect volunteer corn growth at 15 to 45 DATr. Soil water depletion and ETa were similar across crop treatments with and without volunteer corn, as water was not a limiting factor in this study. The ETa of volunteer corn was highest in soybean (623 mm), followed by sorghum (622 mm) and corn (617 mm) under CPI. The SDI had higher irrigation efficiency because, without affecting crop yield, it had 3%, 6%, and 8% lower ETa in soybean (605 mm), sorghum (585 mm), and corn (571 mm), respectively. Although soil water use did not differ with volunteer corn infestation, a soybean yield loss of 27% was observed, which suggests that volunteer corn may not compete for moisture under fully irrigated conditions; however, it can reduce crop yield potential due to competition for factors other than soil moisture.
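Seasonal ETa from sensor-tracked soil water balance is conventionally computed as water inputs minus the change in stored soil water, with drainage and runoff subtracted where measured; a minimal sketch (the numbers below are illustrative, not values from this study, and the study may have treated drainage/runoff differently):

```python
def seasonal_eta(precip_mm: float, irrigation_mm: float,
                 storage_start_mm: float, storage_end_mm: float,
                 drainage_mm: float = 0.0, runoff_mm: float = 0.0) -> float:
    """Actual evapotranspiration (mm) from a simple soil water balance:
    ETa = P + I - dS - D - R, where dS is the change in stored soil water."""
    delta_storage = storage_end_mm - storage_start_mm
    return precip_mm + irrigation_mm - delta_storage - drainage_mm - runoff_mm

# Illustrative season: storage fell by 20 mm, adding to the water consumed.
print(seasonal_eta(precip_mm=350.0, irrigation_mm=300.0,
                   storage_start_mm=250.0, storage_end_mm=230.0))  # prints 670.0
```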
Benghal dayflower and sicklepod are weeds of economic importance in peanut in the southeastern United States due to their extended emergence patterns and the limited number of effective herbicides for their control. Field studies were conducted near Jay, Florida, in 2022 and 2023 to evaluate the effect of planting date and herbicide combinations on Benghal dayflower and sicklepod control in peanut crops. Peanut planted in June was exposed to a higher Benghal dayflower density than peanut planted in May. Sicklepod density was similar between May and June planting dates at 28 d after preemergence and early postemergence herbicide applications, but density was greater in peanut planted in June at 28 d after the mid-postemergence application. A preemergence herbicide application followed by (fb) an early postemergence application of S-metolachlor or diclosulam + S-metolachlor controlled Benghal dayflower 84% to 93% at 28 d after early postemergence in peanut planted in May, but control was reduced to 58% to 78% in the crop planted in June. Regardless of planting date, a preemergence application fb S-metolachlor or diclosulam + S-metolachlor applied early postemergence provided <80% sicklepod control 28 d after early postemergence. Imazapic + dimethenamid-P + 2,4-DB applied postemergence improved Benghal dayflower control to at least 94% at 28 d after mid-postemergence, but sicklepod control was not >85%. Regardless of the planting date, paraquat + bentazon + S-metolachlor applied early postemergence was required to achieve ≥95% sicklepod control. However, herbicide combinations that included paraquat + bentazon + S-metolachlor reduced peanut yield when planting was delayed to June. In fields infested with Benghal dayflower and sicklepod, it is recommended that peanut be planted in early May to minimize the potential impact of these weeds and to increase peanut yield.
Late-planted peanut required more intensive herbicide applications to obtain the same peanut yield as the May-planted peanut.
This manuscript addresses a critical topic: navigating the complexities of conducting clinical trials during a pandemic. Central to this discussion is engaging communities to ensure diverse participation. The manuscript elucidates deliberate strategies employed to recruit minority communities with adverse social drivers of health for participation in COVID-19 trials. The paper adopts a descriptive approach, eschewing analysis of the data-driven efficacy of these efforts, and instead provides a comprehensive account of the strategies utilized. The Accelerate COVID-19 Treatment Interventions and Vaccines (ACTIV) public–private partnership launched early in the COVID-19 pandemic to develop clinical trials to advance SARS-CoV-2 treatments. In this paper, ACTIV investigators share challenges in conducting research during an evolving pandemic and the approaches selected to engage communities when traditional strategies were infeasible. Lessons from this experience include the importance of involving community representatives early in study design and implementation and of integrating well-developed public outreach and communication strategies with trial launch. Centralization and coordination of outreach will allow for efficient use of resources and the sharing of best practices. Insights gleaned from the ACTIV program, as outlined in this paper, shed light on effective strategies for involving communities in treatment trials amidst rapidly evolving public health emergencies. This underscores the critical importance of establishing community engagement initiatives well in advance of a pandemic.
We have established trophoblast cell lines from parthenogenesis-derived buffalo blastocysts. The buffalo trophoblast cells were cultured continuously for over 200 days and 21 passages. These cells were observed by phase-contrast microscopy for their morphology and characterized by reverse transcriptase polymerase chain reaction and immunofluorescence against trophoblast-specific markers and cytoskeletal proteins. Trophoblast cells showed positive staining for CDX2, a marker of these cells, at both the blastocyst and cell line levels. The epithelial morphology of these cells was revealed by positive staining against cytokeratins and tubulin but not against vimentin or Dolichos biflorus agglutinin. Gene expression profiles of many important placenta-specific genes were studied in the primary trophectoderm outgrowths, which were collected on days 0, 5, 9, 12, and 15 of culture, and in the trophoblast cell line at passages 12–15. Therefore, the derived trophoblast cell line can potentially be used for in vitro studies of buffalo embryonic development.
Sicklepod is one of the most difficult-to-control weeds in peanut production in the southeastern United States due to its extended emergence pattern and the limited number of effective herbicides for its control. Growers rely on preemergence herbicides as the foundation of their weed control programs; however, postemergence herbicides are often needed for season-long weed control. The objectives of this study were to evaluate the effects of planting pattern and herbicide combinations on sicklepod control in peanut crops. Due to rapid canopy closure, twin-row planting improved late-season sicklepod control by 13% and peanut yield by 5% compared with a single-row pattern. A preemergence application of fluridone, flumioxazin, or fluridone + flumioxazin provided 76% to 89% control of sicklepod 28 d after preemergence. Regardless of the herbicide applied preemergence, paraquat + bentazon + S-metolachlor applied early postemergence was required to achieve ≥90% sicklepod control 28 d after early postemergence. All preemergence herbicide treatments followed by (fb) S-metolachlor or diclosulam + S-metolachlor applied early postemergence provided <90% control 28 d after early postemergence. A mid-postemergence application of imazapic + dimethenamid-P + 2,4-DB controlled sicklepod 67% to 79% prior to peanut harvest, and biomass reduction was unacceptable (<80%), resulting in difficulty in peanut digging. The highest peanut yield was observed when paraquat + bentazon + S-metolachlor was applied early postemergence fb imazapic + dimethenamid-P + 2,4-DB applied mid-postemergence. Based on the results of this study, the herbicide combination of paraquat + bentazon + S-metolachlor is an important early-season tool for controlling sicklepod in peanut crops. The results also showed that a twin-row planting pattern improved late-season sicklepod control but did not reduce the herbicide input needed to protect peanut yield.