The Scientific Advisory Committee on Nutrition (SACN) provides independent advice on nutrition and related health matters to UK government organisations. In keeping with its commitment to openness and transparency, SACN follows a set ‘Framework’ to ensure a prescribed and consistent approach is taken in all its evidence evaluations. Following an update of the SACN Framework in 2020, which addressed some straightforward issues, the SACN Framework subgroup was established in 2021 to consider more complex matters that were not addressed in the 2020 update. The SACN Framework subgroup considered 4 main topics for update: 1) the different types of evidence evaluations produced by SACN, 2) interpretation of statistical data, 3) tools for assessment of study quality, 4) tools to assess the certainty of a body of evidence for exposure-outcome relationships. The Framework subgroup agreed clear definitions and processes for the different types of evidence evaluations produced by SACN and agreed that interpretation of p values should be informed by consideration of study size, power and methodological quality. The subgroup recommended use of the AMSTAR 2 tool for quality assessment of evidence from systematic reviews and use of the GRADE approach to assess the certainty of evidence. The updated Framework was published in January 2023. This was followed by publication of a further update in October 2024. As a ‘living’ document, the Framework will be subject to regular review by the Framework subgroup and continue to evolve in line with best practice.
Background: TeleStroke can improve access to stroke care in rural areas. We aimed to evaluate the safety and effectiveness of intravenous thrombolysis in our TeleStroke system. Methods: The Manitoba TeleStroke program was rolled out across 7 sites between November 2014 and January 2019. We retrospectively analyzed prospectively collected data from consecutive acute stroke patients over this period. The primary outcome was safety and effectiveness measured by the 90-day modified Rankin Scale (mRS) score. The number of acute ischemic stroke (AIS) patients receiving thrombolysis and endovascular thrombectomy (EVT), and process metrics, were also analyzed. Analyses were performed in R/RStudio version 4.3.2 (significance threshold p<0.05). Results: Of the 1,748 TeleStroke patients (age 71 years [IQR 58-81], 810 [46.3%] female), 696 were identified as having AIS. Of these, 265 (38.1%) received thrombolysis and 48 (6.9%) EVT. Among those receiving thrombolysis, 90-day mortality was 53 (20.0%) and 117 (44.2%) had a favorable outcome (mRS ≤2). Of those who received intravenous thrombolysis, 9 patients (4.2%) had symptomatic intracranial hemorrhage. The median last-seen-normal (LSN)-to-door time was 121 minutes and the median door-to-needle time was 55 minutes. Conclusions: Intravenous thrombolysis was effective with acceptable safety. TeleStroke improved overall access to stroke care and played an important role in identifying AIS patients eligible for thrombolysis and EVT.
The Indian Pulsar Timing Array (InPTA) employs unique features of the upgraded Giant Metrewave Radio Telescope (uGMRT) to monitor dozens of the International Pulsar Timing Array (IPTA) millisecond pulsars (MSPs) simultaneously in the 300-500 MHz and the 1260-1460 MHz bands. This dual-band approach ensures that any frequency-dependent delays are accurately characterized, significantly improving the timing precision for pulsar observations, which is crucial for pulsar timing arrays. We present details of InPTA’s second data release, which comprises 7 years of data on 27 IPTA MSPs. This includes sub-banded Times of Arrival (ToAs), Dispersion Measures (DMs), and initial timing ephemerides for our MSPs. A part of this dataset, originally released in InPTA’s first data release, is being incorporated into IPTA’s third data release, which is expected to detect and characterize nanohertz gravitational waves in the coming years. The entire dataset has been reprocessed for this second data release, providing some of the highest-precision DM estimates so far and revealing interesting solar-wind-related DM variations in some pulsars. This is likely to characterize the noise introduced by the dynamic interstellar ionised medium much better than the previous release, thereby increasing sensitivity to any future gravitational wave search.
Background: Studies have found similar rates of functional independence for men and women after endovascular thrombectomy (EVT). Less is known regarding EVT-related procedural complications and symptomatic intracerebral hemorrhage (sICH) between sexes. Methods: Using the OPTIMISE registry, which includes data from 20 comprehensive stroke centers across Canada between 1/1/2018 and 12/31/2022, we performed a retrospective descriptive analysis of patients stratified by sex. Hemorrhagic transformation on follow-up imaging with associated clinical deterioration was required to define sICH. Results: 3631 patients (1778 men and 1853 women) were included in the analysis. Female patients were older (71.8±14.6 vs 68.0±13.1 years, p<0.001). There were no differences in sICH rates (2.5% men vs. 2% women, p=0.388). Procedural complication rates were not different between men and women (5.8% vs 5.6%, p=0.76): dissection (26 [1.5%] vs. 30 [1.6%], p=0.804), perforation (11 [0.6%] vs. 7 [0.4%], p=0.426), embolization (25 [1.4%] vs. 25 [1.3%], p=0.996), and arterial access complications (45 [2.5%] vs. 43 [2.3%], p=0.761). Conclusions: In this large multicentre registry of stroke patients undergoing EVT, men and women had similarly low and reassuring rates of sICH and procedural complications. This complements previous data showing similar functional outcomes for men and women after EVT.
Background: Anterior (ACS) and posterior circulation (PCS) stroke patients have different clinical presentations and prognoses, though both benefit from endovascular thrombectomy (EVT). We sought to determine whether ACS and PCS patients treated with EVT differed with regard to treatment metrics and functional outcomes. Methods: We retrospectively analysed the Canadian OPTIMISE registry, which included data from 20 comprehensive stroke centers across Canada between January 1, 2018, and December 31, 2022. We performed a descriptive analysis of patients divided into two groups (ACS = carotid artery and its branches, PCS = vertebrobasilar system). Results: Of the 6391 patients included (5929 ACS and 462 PCS), PCS patients were younger (67 vs. 71.3, p<0.001), more often male (61.9% vs. 48.6%, p<0.001), and had longer (in minutes) onset-to-door (362 vs. 256, p<0.001), door-to-needle (172 vs. 144, p=0.0016), and onset-to-puncture (459 vs. 329, p<0.001) times. They were less often thrombolyzed (39.8% vs. 50.4%, p<0.001) and more frequently underwent general anesthesia (47.6% vs. 10.6%, p<0.001). Successful reperfusion and functional independence at 90 days were similar between the two groups. Conclusions: Patients with PCS had worse treatment metrics than those with ACS. Strategies to improve PCS management times, including faster pre-hospital recognition and in-hospital workflows, are critical to reducing these disparities.
This study evaluated the impact of four cover crop species and their termination timings on cover crop biomass, weed control, and corn yield. A field experiment was arranged in a split-plot design in which cover crop species (wheat, cereal rye, hairy vetch, and rapeseed) were the main plot factor and termination timing [4, 2, 1, and 0 wk before planting corn (WBP)] was the subplot factor. In both years (2021 and 2022), hairy vetch produced the most biomass (5,021 kg ha–1) among cover crop species, followed by cereal rye (4,387 kg ha–1), wheat (3,876 kg ha–1), and rapeseed (2,575 kg ha–1). Regression analysis of cover crop biomass against accumulated growing degree days (AGDDs) indicated that for every 100-AGDD increase, the biomass of cereal rye, wheat, hairy vetch, and rapeseed increased by 880, 670, 780, and 620 kg ha–1, respectively. The density of grass and small-seeded broadleaf (SSB) weeds at 4 wk after preemergence herbicide application (WAPR) varied significantly across termination timings. Grass and SSB weed densities were 56% and 36% lower at 0 WBP compared with 2 WBP, and 67% and 61% lower compared with 4 WBP. The sole use of a roller-crimper did not effectively terminate rapeseed at 0 WBP and resulted in the lowest corn yield (3,046 kg ha–1), whereas several other combinations of cover crops and termination timings resulted in greater corn yield. In conclusion, allowing cover crops to grow longer in the spring provides more biomass for weed suppression and affects corn yield.
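As a rough worked restatement of the reported regression slopes (the abstract gives only the change per 100 AGDD, so the intercept below is unspecified and the linear form is assumed purely for illustration):

\[
\text{Biomass (kg ha}^{-1}) \approx \beta_0 + \beta_1\,\text{AGDD}, \qquad
\beta_1 \approx 8.8\ (\text{cereal rye}),\ 7.8\ (\text{hairy vetch}),\ 6.7\ (\text{wheat}),\ 6.2\ (\text{rapeseed})\ \text{kg ha}^{-1}\ \text{per AGDD}.
\]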
Exploration expeditions were conducted for 2 consecutive years in the subtropical region of North India to collect the untapped genetic diversity of Bael. A total of 15 accessions having unique traits of horticultural importance were collected and conserved in the field gene bank. Conserved germplasm was characterized for 3 consecutive years. Considerable variability was found in the morphological characters and biochemical traits. Fruit length ranged from 10.15 to 17.68 cm, fruit circumference varied from 33.45 to 56.32 cm and fruit weight varied from 0.71 to 2.48 kg. Shell thickness varied from 2.11 to 3.62 mm, whereas shell weight varied from 230 to 580 g/fruit. Number of seed sacs per fruit varied from 11.17 to 15.72 and number of seeds per fruit varied from 68.00 to 113.17. Minimum seed weight was 7.04 g/fruit, whereas the maximum was 14.55 g/fruit. Ample variability was found in fruit yield of the collected germplasm, which ranged from 18.85 to 39.26 kg per plant at 16–18 years of age. Distinctive variability in biochemical traits was also found. Total soluble solids in fruit pulp were 34.92–41.13 °Brix, total sugars 11.49–22.16%, acidity 0.36–0.53%, vitamin C 9.89–17.20 mg/100 g, total carotenoids 1.43–2.40 mg/100 g and total tannins 2.50–3.58%. The available genetic diversity may be utilized in crop improvement programmes.
Objective:
To evaluate the impact of COVID-19 prevention training with video-based feedback on nursing home (NH) staff safety behaviors.
Design:
Public health intervention
Setting & Participants:
Twelve NHs in Orange County, California, 6/2020-4/2022
Methods:
NHs received direct-to-staff COVID-19 prevention training and weekly feedback reports with video montages about hand hygiene, mask-wearing, and mask/face-touching. One-hour periods of recorded streaming video from common areas (breakroom, hallway, nursing station, entryway) were randomly sampled across days of the week and nursing shifts and audited for safe behavior. Multivariable models assessed the intervention impact.
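The abstract does not specify the exact model form; the sketch below shows one plausible multivariable specification (a logistic regression of per-opportunity error on month, camera location, and shift) fitted to synthetic data, purely to illustrate how a per-month odds ratio like those reported could be estimated. Variable names and effect sizes are illustrative assumptions, not the study's dataset or model.

```python
# Hypothetical sketch: logistic model of safety-behavior errors over time (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "month_index": rng.integers(0, 22, n),  # months since intervention start (assumed)
    "location": rng.choice(["breakroom", "hallway", "nursing_station", "entryway"], n),
    "shift": rng.choice(["day", "evening", "night"], n),
})
# Simulate an error probability that declines over time, for illustration only.
p_error = 1 / (1 + np.exp(0.5 + 0.08 * df["month_index"]))
df["error"] = rng.binomial(1, p_error)

# Multivariable logistic regression; exp(coefficient) gives the per-month odds ratio.
fit = smf.logit("error ~ month_index + C(location) + C(shift)", data=df).fit(disp=False)
print("OR per month:", round(float(np.exp(fit.params["month_index"])), 2))  # <1 means declining odds
```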
Results:
Video auditing encompassed 182,803 staff opportunities for safe behavior. Hand hygiene errors improved from the first (67.0%) to the last (35.7%) month of the intervention, decreasing 7.6% per month (OR = 0.92, 95% CI = 0.92–0.93, P < 0.001); masking errors improved from the first (10.3%) to the last (6.6%) month, decreasing 2.3% per month (OR = 0.98, 95% CI = 0.97–0.99, P < 0.001); face/mask touching improved from the first (30.0%) to the last (10.6%) month, decreasing 2.5% per month (OR = 0.98, 95% CI = 0.97–0.98, P < 0.001). Hand hygiene errors were most common in entryways and on weekends, with similar rates across shifts. Masking and face/mask-touching errors were most common in breakrooms, with the latter occurring most commonly during the day (7 a.m.–3 p.m.) shift and at similar rates across weekdays/weekends. Error reductions were seen across camera locations, days of the week, and nursing shifts, suggesting a widespread benefit within participating NHs.
Conclusion:
Direct-to-staff training with video-based feedback was temporally associated with improved hand hygiene, masking, and face/mask-touching behaviors among NH staff during the COVID-19 pandemic.
Herbaceous perennials must annually rebuild the aboveground photosynthetic architecture from carbohydrates stored in crowns, rhizomes, and roots. Knowledge of carbohydrate utilization and storage can inform management decisions and improve control outcomes for invasive perennials. We monitored the nonstructural carbohydrates in a population of the hybrid Bohemian knotweed [Polygonum ×bohemicum (J. Chrtek & Chrtková) Zika & Jacobson [cuspidatum × sachalinense]; syn.: Fallopia ×bohemica (Chrtek and Chrtková) J.P. Bailey] and in Japanese knotweed [Polygonum cuspidatum Siebold & Zucc.; syn.: Fallopia japonica (Houtt.) Ronse Decr.]. Carbohydrate storage in crowns followed seasonal patterns typical of perennial herbaceous dicots corresponding to key phenological events. Starch was consistently the highest nonstructural carbohydrate present. Sucrose levels did not show a consistent inverse relationship with starch levels. Lateral distribution of starch in rhizomes and, more broadly, total nonstructural carbohydrates sampled before dormancy break showed higher levels in rhizomes compared with crowns. Total nonstructural carbohydrate levels in crowns reached seasonal lows at an estimated 22.6% of crown dry weight after accumulating 1,453.8 growing degree days (GDD) by the end of June, mainly due to depleted levels of stored starch, with the estimated minimum of 12.3% reached by 1,220.3 GDD accumulated by mid-June. Depletion corresponded to rapid development of vegetative canopy before entering the reproductive phase in August. Maximum starch accumulation in crowns followed complete senescence of aboveground tissues by mid- to late October. Removal of aboveground shoot biomass in late June to early July with removal of vegetation regrowth in early September before senescence would optimize the use of time and labor to deplete carbohydrate reserves. Additionally, foliar-applied systemic herbicide translocation to belowground tissue should be maximized with applications in late August through early fall to optimize downward translocation with assimilate movement to rebuild underground storage reserves. Fall applications should be made before loss of healthy leaf tissue, with the window for control typically ending by late September in Minnesota.
We provide an assessment of the Infinity Two fusion pilot plant (FPP) baseline plasma physics design. Infinity Two is a four-field-period, aspect ratio $A = 10$, quasi-isodynamic stellarator with improved confinement appealing to a max-$J$ approach, elevated plasma density and high magnetic fields ($\langle B\rangle = 9$ T). Here $J$ denotes the second adiabatic invariant. At the envisioned operating point ($800$ MW deuterium-tritium (DT) fusion), the configuration has robust magnetic surfaces based on magnetohydrodynamic (MHD) equilibrium calculations and is stable to both local and global MHD instabilities. The configuration has excellent confinement properties with small neoclassical transport and low bootstrap current ($|I_{\mathrm{bootstrap}}| \sim 2$ kA). Calculations of collisional alpha-particle confinement in a DT FPP scenario show small energy losses to the first wall ($<1.5\,\%$) and stable energetic particle/Alfvén eigenmodes at high ion density. Low turbulent transport is produced using a combination of density profile control consistent with pellet fueling and reduced stiffness to turbulent transport via three-dimensional shaping. Transport simulations with the T3D-GX-SFINCS code suite with self-consistent turbulent and neoclassical transport predict that the DT fusion power $P_{\mathrm{fus}}=800$ MW operating point is attainable with high fusion gain ($Q=40$) at volume-averaged electron densities $n_e\approx 2 \times 10^{20}$ m$^{-3}$, below the Sudo density limit. Additional transport calculations show that an ignited ($Q=\infty$) solution is available at slightly higher density ($2.2 \times 10^{20}$ m$^{-3}$) with $P_{\mathrm{fus}}=1.5$ GW. The magnetic configuration is defined by a magnetic coil set with sufficient room for an island divertor, shielding and blanket solutions with tritium breeding ratios (TBR) above unity. An optimistic estimate for the gas-cooled solid breeder design, the helium-cooled pebble bed, is TBR $\sim 1.3$. Infinity Two satisfies the physics requirements of a stellarator fusion pilot plant.
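For orientation, reading the quoted fusion gain with its standard definition (ratio of fusion power to externally supplied heating power) gives a back-of-the-envelope restatement of the operating point; the figure below is an inference from the numbers above, not a value given in the abstract:

\[
Q \equiv \frac{P_{\mathrm{fus}}}{P_{\mathrm{ext}}} \;\;\Rightarrow\;\; P_{\mathrm{ext}} \approx \frac{800\ \text{MW}}{40} = 20\ \text{MW}, \qquad Q = \infty\ \text{(ignited)} \iff P_{\mathrm{ext}} \to 0 .
\]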
The selection, design and optimization of a suitable blanket configuration for an advanced high-field stellarator concept is seen as a key feasibility issue and has been incorporated as a vital and necessary part of the Infinity Two fusion pilot plant physics basis. The focus of this work was to identify a baseline blanket which can be rapidly deployed for Infinity Two while also maintaining flexibility and opportunities for higher-performing concepts later in development. Results from this analysis indicate that gas-cooled solid breeder designs such as the helium-cooled pebble bed (HCPB) are the most promising concepts, primarily motivated by the neutronics performance at applicable blanket build depths and the relatively mature technology basis. The lithium lead (PbLi) family of concepts, particularly the dual-cooled lithium lead, offer a compelling alternative to solid blanket concepts as they have synergistic developmental pathways while simultaneously mitigating much of the technical risk of those designs. Homogenized three-dimensional neutronics analysis of the Infinity Two configuration indicates that the HCPB achieves an adequate tritium breeding ratio (TBR) (1.30, which provides sufficient margin at low engineering fidelity) and near-appropriate shielding of the magnets (average fast fluence of $1.3 \times 10^{18}$ n cm$^{-2}$ per full-power year). The thermal analysis indicates that reasonably high thermal efficiencies (greater than 30%) are readily achievable with the HCPB paired with a simple Rankine cycle using reheat. Finally, the tritium fuel cycle analysis for Infinity Two shows viability, with anticipated operational inventories of less than one kilogram (approximately 675 g) and a required TBR (TBR$_{\mathrm{req}}$) of less than 1.05 to maintain fuel self-sufficiency (approximately 1.023 for a driver blanket with no inventory doubling). Although further optimization and engineering design are still required, at the physics basis stage all initial targets have been met for the Infinity Two configuration.
Accountable care models for Medicaid reimbursement aim to improve care quality and reduce costs by linking payments to performance. Oregon’s coordinated care organizations (CCOs) assume financial responsibility for their members and are incentivized to help clinics improve performance on specific quality metrics. This study explores how Oregon’s CCO model influences partnerships between payers and primary care clinics, focusing on strategies used to enhance screening and treatment for unhealthy alcohol use (UAU).
Methods:
In this qualitative study, we conducted semi-structured interviews with informants from 12 of 13 Oregon CCOs active in 2019 and 2020. The interviews focused on payer–provider partnerships, specifically around UAU screening and treatment, which is a long-standing CCO metric. We used thematic analysis to identify key themes and causal-loop diagramming to uncover feedback dynamics and communicate key findings. Meadows’ leverage point framework was applied to categorize findings based on their potential to drive change.
Results:
CCO strategies to support clinics included building relationships, reporting on metric progress, providing EHR technical assistance, offering training, and implementing alternative payment methods. CCOs prioritized clinics with more members and those that were highly motivated. Our analysis showed that while the CCO model aligned goals between payers and clinics, it may perpetuate rural disparities by prioritizing larger, better-resourced clinics.
Conclusions:
Oregon’s CCO model fosters partnerships centered on quality metrics but may unintentionally reinforce rural disparities by incentivizing support for larger clinics. Applying the Meadows framework highlighted leverage points within these partnerships.
Primary hyperparathyroidism (PHPT) is the presence of hypercalcaemia with an elevated or inappropriately normal parathyroid hormone level. In clinical psychiatry this is often detected on routine blood investigations. This article aims to help mental health professionals understand the relevance of PHPT to psychiatry and offers some guidance about further management of patients presenting with this endocrine abnormality in mental health settings. PHPT can be associated with both mental and physical health problems in some individuals, making it a crucial diagnosis that should not be overlooked.
Since COVID-19, the use of telehealth in mental healthcare has increased. A 2021 online survey (TeleSCOPE 1.0 [T1]) identified challenges in evaluating, diagnosing, and monitoring drug-induced movement disorders (DIMDs) via telehealth (video or phone). TeleSCOPE 2.0 (T2) was conducted to understand the impact of telehealth after COVID restrictions were lifted.
Methods
T2 was fielded (5/18-6/9/2023) to neurologists (neuro), psychiatrists (psych), and nurse practitioners (NP)/physician assistants (PA) affiliated with neuro/psych practices who prescribed vesicular monoamine transporter 2 inhibitors or benztropine for DIMD in the past 6 months and saw ≥15% of patients via telehealth at peak and post-COVID.
Results
100 neuros, 100 psychs, and 105 NP/PAs responded. More patients were seen in person post-COVID (12-27% vs 31-53%), but the percentage seen by video remained largely unchanged (54-62% vs 37-53%). Issues influencing appointment setting in T2 remained access to care, technology, and digital literacy, although T2 clinicians reported that fewer patients had issues connecting for a video visit. In T2, clinicians used multiple telehealth methods to evaluate DIMDs, including personal phone videos (48-66%), telemedicine apps (36-45%), health/fitness trackers (6-13%), and other methods (2-5%). Common T2 diagnostic telehealth issues included determining signs of difficulty with gait, falls, walking, and standing; difficulty writing or using a phone or computer; and painful movements. Among patients evaluated for DIMD, more received an eventual diagnosis in T2 vs T1 both in person (34-53% vs 26-46%) and by video (32-51% vs 29-44%), but, on average, neuros and psychs required one more telehealth visit to confirm a DIMD diagnosis vs in-person. Over half of clinicians, on average, recommended patients come in person to confirm a DIMD diagnosis. Most clinicians reported ongoing difficulty diagnosing patients via phone. Neuros were less comfortable than psychs/NP/PAs with telehealth visits due to the risk of misdiagnosis and liability. While all clinicians saw advantages to telehealth, neuros expect to see more of their patients in person post-COVID. However, in T2, the number of clinicians who found it difficult to manage DIMD cases by video had significantly decreased (T1 52-54%; T2 28-36%). Half of clinicians reported the absence of a caregiver as a significant barrier to diagnosis and treatment via telehealth. Clear guidelines and provider education were the most feasible strategies to implement to improve telehealth quality of care.
Conclusions
T2 clinicians are more comfortable managing DIMDs via telehealth but require ~1 more visit to confirm a diagnosis vs in-person. Significant barriers to telehealth remain, including digital literacy, inconsistent caregiver presence, and lack of clear diagnostic guidelines. Clinicians see value in telehealth, but it is still not as effective as in-person care. Significantly more clinicians are in-office post-COVID, and >50% recommend patients come in person to confirm a DIMD diagnosis.
This study aimed to explore the genetic variability present in tamarind fruits. A survey and collection of twenty-nine tamarind accessions from the Bastar region of Chhattisgarh was conducted, focusing on morphological traits, biochemical properties, and mineral content. The analysis revealed significant variation in fruit characteristics, including pod weight (91.1–528.3 g), pod length (4.11–15.39 cm), pulp weight (32.88–275.68 g), number of seeds (26–237), seed weight (23.14–214.08 g), pulp percentage (26.43–52.18%), vitamin C content (54.5–92 mg/100 g), phenolic content (51.53–296.4 mg GAE/g fw), flavonoid content (75.91–280.88 mg QE/100 g fw), acidity (5.3–12.60%), reducing sugars (24.67–68.29%), total sugars (24.89–78.87%), calcium (0.15–1.28%), and iron content (26.6–125.7 ppm) across different accessions. Based on the overall evaluation, five accessions (B21, B26, B15, B25, and B7) with the best combination of desirable fruit traits were identified as the most promising. Additionally, five sweet accessions with acidity levels below 6% were identified (B26, B21, B15, B12, B11). Principal component analysis (PCA) identified five principal components that accounted for 86.73% of the total variability. Correlation analysis showed a significant positive relationship between pod weight and pulp weight (r = 0.93), shell weight (r = 0.70), number of seeds (r = 0.89), and seed weight (r = 0.89). The biplot of PC1 and PC2 illustrated the distribution of accessions across all four quadrants, with B27, B8, B26, B29, B14, B18, and B13 displaying distinct differences from one another.
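A minimal sketch of the PCA-plus-correlation workflow described above, run on a synthetic trait table (the trait names and values are placeholders, not the study's data):

```python
# Illustrative PCA and correlation analysis on a synthetic fruit-trait table.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
traits = ["pod_weight", "pulp_weight", "seed_weight", "vitamin_c", "acidity", "total_sugars"]
X = pd.DataFrame(rng.normal(size=(29, len(traits))), columns=traits)  # 29 accessions (synthetic)

# Standardize so PCA operates on the correlation structure, then keep five components.
Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=5).fit(Z)
print("Variance explained by PC1-PC5:", pca.explained_variance_ratio_.sum())

# Pairwise Pearson correlations between traits (cf. pod weight vs. pulp weight, r = 0.93 in the study).
print(X.corr().loc["pod_weight", "pulp_weight"])
```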
Clinical trials often struggle to recruit enough participants, with only 10% of eligible patients enrolling. This is concerning for conditions like stroke, where timely decision-making is crucial. Frontline clinicians typically screen patients manually, but this approach can be overwhelming and lead to many eligible patients being overlooked.
Methods:
To address the problem of efficient and inclusive screening for trials, we developed a matching algorithm that uses imaging and clinical variables gathered as part of the AcT trial (NCT03889249) to automatically screen patients by matching these variables against trials’ inclusion and exclusion criteria using rule-based logic. We then used the algorithm to identify patients who could have been enrolled in six trials: EASI-TOC (NCT04261478), CATIS-ICAD (NCT04142125), CONVINCE (NCT02898610), TEMPO-2 (NCT02398656), ESCAPE-MEVO (NCT05151172), and ENDOLOW (NCT04167527). To evaluate our algorithm, we compared our findings to the number of enrollments achieved without a matching algorithm. The algorithm’s performance was validated against ground truth from a manual review by two clinicians. The algorithm’s ability to reduce screening time was assessed by comparing it with the average time taken by study clinicians.
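The abstract describes matching patient variables against each trial’s inclusion and exclusion criteria with rule-based logic; the sketch below illustrates that idea only. The criteria, thresholds, and field names are invented placeholders, not the actual criteria of the listed trials.

```python
# Minimal rule-based eligibility screener (illustrative criteria and field names only).
from dataclasses import dataclass, field
from typing import Callable, Dict, List

Rule = Callable[[Dict], bool]

@dataclass
class Trial:
    name: str
    inclusion: List[Rule] = field(default_factory=list)
    exclusion: List[Rule] = field(default_factory=list)

    def is_eligible(self, patient: Dict) -> bool:
        # Eligible if every inclusion rule passes and no exclusion rule fires.
        return (all(rule(patient) for rule in self.inclusion)
                and not any(rule(patient) for rule in self.exclusion))

# Hypothetical trial with placeholder thresholds.
example_trial = Trial(
    name="EXAMPLE-TRIAL",
    inclusion=[
        lambda p: p["age"] >= 18,
        lambda p: p["nihss"] >= 6,                   # placeholder severity cutoff
        lambda p: p["onset_to_arrival_min"] <= 360,  # placeholder time window
    ],
    exclusion=[
        lambda p: p["intracranial_hemorrhage"],      # e.g., derived from imaging variables
    ],
)

patient = {"age": 72, "nihss": 9, "onset_to_arrival_min": 180, "intracranial_hemorrhage": False}
print(example_trial.is_eligible(patient))  # True for this synthetic patient
```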
Results:
The algorithm identified more potentially eligible study candidates than the number of participants actually enrolled. It also showed over 90% sensitivity and specificity for all trials and reduced screening time by over 100-fold.
Conclusions:
Automated matching algorithms can help clinicians quickly identify eligible patients and reduce resources needed for enrolment. Additionally, the algorithm can be modified for use in other trials and diseases.
Direct numerical simulations of the turbulence of a Herschel–Bulkley (HB) fluid in a rough channel are performed at a shear Reynolds number $Re_{\tau } \approx 300$ and a Bingham number ${Bn} \approx 0.9$. For the type of rough surface used in this study, the results indicate that Townsend's wall similarity hypothesis also holds for HB fluids. However, there are notable differences compared with the effect of roughness on Newtonian fluids. More specifically, the effect of roughness appears to be slightly stronger for HB fluids, in the sense that the bulk Reynolds number, based on the viscosity at the wall, is reduced further due to the increase in viscosity in the troughs of the roughness surface induced by the low shear. At the same time, for the simulated rough surface, the contribution of form drag to the total pressure drop is reduced from 1/4 to about 1/5 due to the persistence of viscous shear in the boundary layer, reducing its shielding effect. As for the friction factor, due to the nonlinearity of the HB constitutive relation, its use with the wall shear rate from the mean wall shear stress underpredicts the minimum viscosity at the wall by up to 18 %. This inevitably leads to uncertainties in the prediction of the friction factor. Finally, it is observed that the rough surface is unable to break the peculiar near-wall flow structure of HB fluids, which consists of long persistent low-speed streaks occupying the entire domain. This means that the small-scale energy is significantly reduced for HB fluids, even in rough channels, with the energy more concentrated in the lower wavenumber range, implying an increase in the slope of the power spectrum to $-7/2$ in the inertial range, as shown by Mitishita et al. (J. Non-Newtonian Fluid Mech., vol. 293, 2021, 104570).
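For reference, the Herschel–Bulkley constitutive relation and the apparent viscosity it implies (the nonlinearity referred to above) can be written as

\[
\tau = \tau_y + K\dot{\gamma}^{\,n} \ \ (\tau > \tau_y), \qquad
\mu_{\mathrm{app}}(\dot{\gamma}) = \frac{\tau_y}{\dot{\gamma}} + K\dot{\gamma}^{\,n-1},
\]

with yield stress $\tau_y$, consistency $K$ and flow index $n$; because $\mu_{\mathrm{app}}$ depends nonlinearly on $\dot{\gamma}$, a viscosity computed from the shear rate recovered from the mean wall shear stress generally differs from the statistics of the instantaneous wall viscosity, which is the source of the discrepancy noted above.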
Herbicides have been placed in global Herbicide Resistance Action Committee (HRAC) herbicide groups based on their sites of action (e.g., acetolactate synthase–inhibiting herbicides are grouped in HRAC Group 2). A major driving force for this classification system is that growers have been encouraged to rotate or mix herbicides from different HRAC groups to delay the evolution of herbicide-resistant weeds, because in theory, all active ingredients within a herbicide group physiologically affect weeds similarly. Although herbicide resistance in weeds has been studied for decades, recent research on the biochemical and molecular basis for resistance has demonstrated that patterns of cross-resistance are usually quite complicated and much more complex than merely stating, for example, a certain weed population is Group 2-resistant. The objective of this review article is to highlight and describe the intricacies associated with the magnitude of herbicide resistance and cross-resistance patterns that have resulted from myriad target-site and non–target site resistance mechanisms in weeds, as well as environmental and application timing influences. Our hope is this review will provide opportunities for students, growers, agronomists, ag retailers, regulatory personnel, and research scientists to better understand and realize that herbicide resistance in weeds is far more complicated than previously considered when based solely on HRAC groups. Furthermore, a comprehensive understanding of cross-resistance patterns among weed species and populations may assist in managing herbicide-resistant biotypes in the short term by providing growers with previously unconsidered effective control options. This knowledge may also inform agrochemical company efforts aimed at developing new resistance-breaking chemistries and herbicide mixtures. However, in the long term, nonchemical management strategies, including cultural, mechanical, and biological weed management tactics, must also be implemented to prevent or delay increasingly problematic issues with weed resistance to current and future herbicides.
Inadequate response to first- and second-line pharmacological treatments for psychiatric disorders is commonly observed. Ketamine has demonstrated efficacy in treating adults with treatment-resistant depression (TRD), with additional off-label benefits reported for various psychiatric disorders. Herein, we performed a systematic review and meta-analysis to examine the therapeutic applications of ketamine across multiple mental disorders, excluding mood disorders.
Methods
We conducted a multidatabase literature search of randomized controlled trials and open-label trials investigating the therapeutic use of ketamine in treating mental disorders. Studies utilizing the same psychological assessments for a given disorder were pooled using the generic inverse variance method to generate a pooled estimated mean difference.
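For reference, the generic inverse variance method pools study-level mean differences $\hat{\theta}_i$ by weighting each by the inverse of its variance (shown here in its fixed-effect form; a random-effects variant adds a between-study variance $\tau^2$ to each $\operatorname{Var}(\hat{\theta}_i)$):

\[
\hat{\theta}_{\mathrm{pooled}} = \frac{\sum_i w_i\,\hat{\theta}_i}{\sum_i w_i}, \qquad
w_i = \frac{1}{\operatorname{Var}(\hat{\theta}_i)}, \qquad
\operatorname{SE}\!\left(\hat{\theta}_{\mathrm{pooled}}\right) = \left(\sum_i w_i\right)^{-1/2}.
\]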
Results
The search in OVID (MedLine, Embase, AMED, PsycINFO, JBI EBP Database), EBSCO CINAHL Plus, Scopus, and Web of Science yielded 44 studies. Ketamine had a statistically significant effect on PTSD Checklist for DSM-5 (PCL-5) scores (pooled estimate = ‒28.07, 95% CI = [‒40.05, ‒16.11], p < 0.001), Clinician-Administered PTSD Scale for DSM-5 (CAPS-5) scores (pooled estimate = ‒14.07, 95% CI = [‒26.24, ‒1.90], p = 0.023), and Yale-Brown Obsessive Compulsive Scale (Y-BOCS) scores (pooled estimate = ‒8.08, 95% CI = [‒13.64, ‒2.52], p = 0.004) in individuals with PTSD, treatment-resistant PTSD (TR-PTSD), and obsessive-compulsive disorder (OCD), respectively. For alcohol use disorders and at-risk drinking, there was disproportionate reporting of decreased urge to drink, increased rate of abstinence, and longer time to relapse following ketamine treatment.
Conclusions
Extant literature supports the potential use of ketamine for the treatment of PTSD, OCD, and alcohol use disorders with significant improvement of patient symptoms. However, the limited number of randomized controlled trials underscores the need to further investigate the short- and long-term benefits and risks of ketamine for the treatment of psychiatric disorders.