Vitamin D deficiency is highly prevalent in the UK (1,2). Low exposure to the sun in winter months, as well as a higher risk of deficiency amongst some ethnic minority populations (1), means that fortification of food and beverages remains an important potential route to ensure optimal vitamin D status. However, it is unclear whether the type of fortified food affects its ability to raise vitamin D status. Animal-based foods (e.g. dairy) would be expected to lead to higher vitamin D absorption than non-animal-based foods (e.g. bread, juice), due to their higher fat content. The primary aim of this systematic review and meta-analysis was to investigate the effectiveness of animal-based and non-animal-based vitamin D fortified foods in raising serum 25-hydroxyvitamin D (25(OH)D).
The literature search was conducted using PubMed on 23 January 2024. Inclusion criteria were as follows: data on non-pregnant/non-lactating adults or on children; randomised controlled trial design; and 25(OH)D measurement data. The initial search retrieved 701 publications, of which 593 ineligible records were removed. The remaining 108 records were screened by title and abstract, with 63 excluded for the following reasons: off topic (n=54); pregnant or breastfeeding participants (n=6); non-human subjects (n=1); preterm infants (n=1); and duration <4 weeks (n=1). After full-text eligibility screening, 28 publications remained for systematic review and meta-analysis. Ethical approval was not required as this was a literature review.
The end-point meta-analysis showed, for all studies combined, a significant increase in 25(OH)D of +23.4 (95% CI 17.0, 29.7) nmol/L (24 studies). For specific food types, results were as follows: ‘animal’ +21.7 (95% CI 14.1, 29.3) nmol/L (17 studies); mixture of ‘animal’ and ‘non-animal’ +26.1 (95% CI 10.8, 41.4) nmol/L (1 study); ‘non-animal’ +28.1 (95% CI 12.0, 44.2) nmol/L (6 studies).
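For readers unfamiliar with how such pooled estimates are formed, the sketch below shows standard inverse-variance random-effects pooling (DerSimonian-Laird), the usual approach for end-point meta-analyses of this kind. The per-study effects and standard errors are illustrative placeholders, not data from this review, and the authors' actual software is not stated.

```python
# Minimal sketch of DerSimonian-Laird random-effects pooling.
# Study-level effects/SEs below are invented for illustration only.
import numpy as np

effects = np.array([20.0, 25.0, 18.0, 30.0])  # per-study change in 25(OH)D, nmol/L
se = np.array([4.0, 5.0, 3.5, 6.0])           # per-study standard errors

w_fe = 1.0 / se**2                            # fixed-effect (inverse-variance) weights
mean_fe = np.sum(w_fe * effects) / np.sum(w_fe)
q = np.sum(w_fe * (effects - mean_fe) ** 2)   # Cochran's Q heterogeneity statistic
c = np.sum(w_fe) - np.sum(w_fe**2) / np.sum(w_fe)
tau2 = max(0.0, (q - (len(effects) - 1)) / c) # between-study variance (DL estimator)

w_re = 1.0 / (se**2 + tau2)                   # random-effects weights
pooled = np.sum(w_re * effects) / np.sum(w_re)
pooled_se = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled: {pooled:.1f} nmol/L "
      f"(95% CI {pooled - 1.96 * pooled_se:.1f}, {pooled + 1.96 * pooled_se:.1f})")
```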
Contrary to expectations, non-animal modes of fortification (e.g. bread, juice) had a similar effect size to animal modes (e.g. dairy), and so can be considered equivalent in effectiveness at raising 25(OH)D concentration. Differences between the non-animal and animal modes in dose, duration and population groups (in terms of health and baseline vitamin D status) mean the results should be interpreted with caution, and future studies in which these factors are standardised would provide further evidence of effectiveness.
Patients with posttraumatic stress disorder (PTSD) exhibit smaller regional brain volumes in commonly reported regions such as the amygdala and hippocampus, which are associated with fear and memory processing. In the current study, we conducted a voxel-based morphometry (VBM) meta-analysis using whole-brain statistical maps with neuroimaging data from the ENIGMA-PGC PTSD working group.
Methods
T1-weighted structural neuroimaging scans from 36 cohorts (PTSD n = 1309; controls n = 2198) were processed using a standardized VBM pipeline (ENIGMA-VBM tool). We meta-analyzed the resulting statistical maps for voxel-wise differences in gray matter (GM) and white matter (WM) volumes between PTSD patients and controls, performed subgroup analyses considering the trauma exposure of the controls, and examined associations between regional brain volumes and clinical variables including PTSD severity (CAPS-4/5, PCL-5) and depression severity (BDI-II, PHQ-9).
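As a point of reference for the effect sizes reported below, here is a minimal sketch of the bias-corrected standardized mean difference (Hedges' g) computed from group summary statistics. The input values are invented for illustration and are not ENIGMA-PGC data.

```python
# Hedges' g: Cohen's d with a small-sample bias correction.
import math

def hedges_g(mean1, mean2, sd1, sd2, n1, n2):
    """Bias-corrected standardized mean difference between two groups."""
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / sp              # Cohen's d (pooled SD)
    j = 1 - 3 / (4 * (n1 + n2) - 9)       # small-sample correction factor
    return j * d

# e.g. a regional GM volume (arbitrary units), controls vs patients
print(hedges_g(mean1=5.10, mean2=5.00, sd1=0.45, sd2=0.47, n1=2198, n2=1309))
```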
Results
PTSD patients exhibited smaller GM volumes across the frontal and temporal lobes, and cerebellum, with the most significant effect in the left cerebellum (Hedges’ g = 0.22, p_corrected = .001), and smaller cerebellar WM volume (peak Hedges’ g = 0.14, p_corrected = .008). We observed similar regional differences when comparing patients to trauma-exposed controls, suggesting these structural abnormalities may be specific to PTSD. Regression analyses revealed PTSD severity was negatively associated with GM volumes within the cerebellum (p_corrected = .003), while depression severity was negatively associated with GM volumes within the cerebellum and superior frontal gyrus in patients (p_corrected = .001).
Conclusions
PTSD patients exhibited widespread regional differences in brain volumes, with greater regional deficits appearing to reflect more severe symptoms. Our findings add to the growing literature implicating the cerebellum in PTSD psychopathology.
We describe the caudal analgesic agents, doses, reported adverse events, and outcomes in a single-centre, retrospective cohort study of 200 patients undergoing cardiac surgery from October 2020 to April 2023. Median (interquartile range) doses of clonidine and morphine were 2.7 (2.1–3) mcg/kg and 0.12 (0.1–1.12) mg/kg, respectively. Our findings suggest that a clonidine/morphine caudal was tolerated in cardiac surgical patients.
Multicenter clinical trials are essential for evaluating interventions but often face significant challenges in study design, site coordination, participant recruitment, and regulatory compliance. To address these issues, the National Institutes of Health’s National Center for Advancing Translational Sciences established the Trial Innovation Network (TIN). The TIN offers a scientific consultation process, providing access to clinical trial and disease experts who offer input and recommendations throughout the trial’s duration, at no cost to investigators. This approach aims to improve trial design, accelerate implementation, foster interdisciplinary teamwork, and spur innovations that enhance multicenter trial quality and efficiency. The TIN leverages resources of the Clinical and Translational Science Awards (CTSA) program, complementing local capabilities at the investigator’s institution. The Initial Consultation process focuses on the study’s scientific premise, design, site development, recruitment and retention strategies, funding feasibility, and other support areas. As of 6/1/2024, the TIN has provided 431 Initial Consultations to increase efficiency and accelerate trial implementation by delivering customized support and tailored recommendations. Across a range of clinical trials, the TIN has developed standardized, streamlined, and adaptable processes. We describe these processes, provide operational metrics, and include a set of lessons learned for consideration by other trial support and innovation networks.
Following the recent report of strongyloidiasis caused by Strongyloides fuelleborni within a semi-captive colony of baboons in a UK safari park, we investigated the genetic relationships of this isolate with other Strongyloides isolates across the world. Whole-genome sequencing data were generated, followed by phylogenetic analysis of mitochondrial (mt) cytochrome oxidase subunit 1 (cox1) and nuclear ribosomal 18S sequences against 300 published Strongyloides reference genotypes. The putative African origin of the UK S. fuelleborni was confirmed, and full-length mt genome sequences were assembled to facilitate a more detailed phylogenetic analysis of 14 mt coding regions against all available Strongyloides species. Our analyses demonstrated that the UK isolate represented a novel African lineage not previously described. Additional complete mt genomes were assembled for several individual UK safari park worms, revealing a slightly altered mt genome gene arrangement that allows clear separation from Asian S. fuelleborni. Furthermore, these UK worms possessed expanded intergenic regions of unknown function that increase their mt genome size to approximately 24 kilobases (kb), compared with some 16 kb for Asian S. fuelleborni; this may have arisen from unique population founder and genetic drift effects set within the peculiar mixed-species baboon and drill ancestry of this semi-captive primate colony. A maximum likelihood phylogeny constructed from 14 mt coding regions also supported an evolutionary distinction between Asian and African S. fuelleborni.
We undertake a comprehensive investigation into the distribution of in situ stars within Milky Way-like galaxies, leveraging TNG50 simulations and comparing their predictions with data from the H3 survey. Our analysis reveals that 28% of galaxies demonstrate reasonable agreement with H3, while only 12% exhibit excellent alignment in their profiles, regardless of the specific spatial cut employed to define in situ stars. To uncover the underlying factors contributing to deviations between TNG50 and H3 distributions, we scrutinise correlation coefficients among internal drivers (e.g. virial radius, star formation rate [SFR]) and merger-related parameters (such as the effective mass-ratio, mean distance, average redshift, total number of mergers, average spin-ratio, and maximum spin alignment between merging galaxies). Notably, we identify significant correlations between deviations from observational data and key parameters such as the median slope of virial radius, mean SFR values, and the rate of SFR change across different redshift scans. Furthermore, positive correlations emerge between deviations from observational data and parameters related to galaxy mergers. We validate these correlations using the Random Forest Regression method. Our findings underscore the invaluable insights provided by the H3 survey in unravelling the cosmic history of galaxies akin to the Milky Way, thereby advancing our understanding of galactic evolution and shedding light on the formation and evolution of Milky Way-like galaxies in cosmological simulations.
An estimated 129,000 cases of Lyme borreliosis (LB) are reported annually in Europe. In 2022, we conducted a representative web-based survey of 28,034 persons aged 18–65 years in 20 European countries to describe tick and LB risk exposures and perceptions. Nearly all respondents (95.0%) were aware of ticks (range, 90.4% in the UK to 98.8% in Estonia). Among those aware of ticks, most (85.1%) were also aware of LB (range, 70.3% in Switzerland to 97.0% in Lithuania). Overall, 8.3% of respondents reported a past LB diagnosis (range, 3.0% in Romania to 13.8% in Sweden). Respondents spent a weekly median of 7 (interquartile range [IQR] 3–14) hours in green spaces at home and 9 (IQR 4–16) hours away from home during April–November. The most common tick prevention measures always or often used were checking for ticks (44.8%) and wearing protective clothing (40.2%). This large multicountry survey provided needed data that can be used to design targeted LB prevention programmes in Europe.
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
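To make the PRS analyses concrete, the sketch below shows how a polygenic risk score is conventionally computed: a sum of risk-allele dosages weighted by GWAS effect sizes. The SNP identifiers, weights, and dosages are hypothetical placeholders; the study's actual scoring pipeline is not detailed here.

```python
# Conventional polygenic risk score (PRS): dosage-weighted sum of
# GWAS effect sizes. All identifiers and values are illustrative.
gwas_betas = {"rs0001": 0.12, "rs0002": -0.08, "rs0003": 0.05}  # log-odds per risk allele
dosages = {"rs0001": 2, "rs0002": 1, "rs0003": 0}               # 0/1/2 risk alleles carried

prs = sum(beta * dosages[snp] for snp, beta in gwas_betas.items())
print(f"PRS = {prs:.3f}")  # higher values indicate greater genetic liability
```

In a case-case design such as this one, the score would then be tested for its ability to separate BPD from MDD cases, e.g. via the area under the ROC curve.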
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD patients and BPD patients primarily lie on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
Two studies were conducted in 2022 and 2023 near Rocky Mount and Clayton, NC, to determine the optimal granular ammonium sulfate (AMS) rate and application timing for pyroxasulfone-coated AMS. In the rate study, AMS rates included 161, 214, 267, 321, 374, 428, and 481 kg ha−1, equivalent to 34, 45, 56, 67, 79, 90, and 101 kg N ha−1, respectively. All rates were coated with pyroxasulfone at 118 g ai ha−1 and topdressed onto 5- to 7-leaf cotton. In the timing study, pyroxasulfone (118 g ai ha−1) was coated on AMS and topdressed at 321 kg ha−1 (67 kg N ha−1) onto 5- to 7-leaf, 9- to 11-leaf, and first bloom cotton. In both studies, weed control and cotton tolerance to pyroxasulfone-coated AMS were compared to pyroxasulfone applied POST and POST-directed. The check in both studies received non-herbicide-treated AMS (321 kg ha−1). Before treatment applications, all plots (including the check) were maintained weed-free with glyphosate and glufosinate. In both studies, pyroxasulfone applied POST was most injurious (8% to 16%), while pyroxasulfone-coated AMS resulted in ≤4% injury. Additionally, no differences in cotton lint yield were observed in either study. With the exception of the lowest rate of AMS (161 kg ha−1; 79%), all AMS rates coated with pyroxasulfone controlled Palmer amaranth ≥83%, comparably to pyroxasulfone applied POST (92%) and POST-directed (89%). In the timing study, the application method did not affect Palmer amaranth control; however, applications made at the mid- and late timings outperformed early applications. These results indicate that pyroxasulfone-coated AMS can control Palmer amaranth comparably to pyroxasulfone applied POST and POST-directed, with minimal risk of cotton injury. However, the application timing could warrant additional treatment to achieve adequate late-season weed control.
Despite dietary guidance in over 90 countries and resources like the UK’s Eatwell Guide, most individuals do not adhere to or achieve dietary aims(1,2). Specifically in the UK, population intakes of free sugars remain above the <5% recommendation, at ∼10% of total energy intake(3). To improve adherence to messages such as ‘reducing free sugars’, it may be helpful to identify barriers and facilitators to adherence whilst individuals attempt to modify their dietary patterns.
Participants were randomly selected from a randomised controlled trial investigating the effects of three different types of advice to reduce free sugars, versus control, on free sugar intakes(4). A semi-structured interview explored barriers and facilitators to dietary adherence. Covariate adaptive randomisation ensured equal numbers of interviews at all timepoints across the 12-week study period and from participants in each trial arm. Data were analysed using framework analysis(5).
Sixty-two interviews were conducted across a 12-month period in 2021–2022. Seven themes for barriers and facilitators to recommendation adherence, encompassing 14 subthemes, were identified: 1) Proof and impact; 2) Realities of life; 3) Personal balance and empowerment; 4) Habitual approach; 5) Is it possible?; 6) Extensive awareness and viewpoint; and 7) Power of knowledge. Emergent themes sit within a context in which individuals were challenged to reduce their intakes of free sugars and/or accurately record dietary intakes; they therefore relate specifically to a dietary recording and free sugar reduction scenario. Participant interviews detected both internal and external environmental factors contributing to approaches to change. These factors were interrelated with self and community awareness, describing how individuals may utilise knowledge and understanding. Intervention participants reported all themes more than control participants, except for the subtheme ‘limited impact’. There were no observable reporting differences between the three intervention groups. Over the 12-week study period, the positive subtheme ‘enables’ within the theme ‘power of knowledge’ was more prominent at intervention delivery (week 1) than at week 12. Additionally, the subthemes ‘active’ and ‘empower’ were reported more in those with higher adherence scores. These results suggest that dietary recommendations may need to be adapted to incorporate the stage at which dietary behavioural change takes place, with some focus on maintenance as well as change. Overall, participant reports revealed that dietary advice needs to be appropriate for the person receiving it, easily understood, applicable, and actively engaging.
Our findings, when considered with the wider literature, may help us to better understand attempts to make dietary changes based on dietary advice, and support an individualised approach to dietary management. This greater understanding will help future advice to reduce free sugar intakes, including policy and public health initiatives.
Daily sodium intake in England is ∼3.3 g/day(1), with government and scientific advice to reduce intake for cardiovascular health purposes having varying success(2). Eccrine sweat is produced during exercise or exposure to warm environments to maintain body temperature through evaporative cooling. Sweat is primarily water, but also contains appreciable amounts of electrolytes, particularly sodium, meaning sweat sodium losses could reduce daily sodium balance without the need for dietary manipulation. However, the effects of sweat sodium losses on 24-h sodium balance are unclear.
Fourteen active participants (10 males, 4 females; 23±2 years, 45±9 mL/kg/min) completed a preliminary trial and two 24-h randomised, counterbalanced experimental trials. Participants arrived fasted for baseline (0-h) measures (blood/urine samples, blood pressure, nude body mass), followed by breakfast and low-intensity intermittent cycling in the heat (∼36°C, ∼50% humidity) to turn over ∼2.5% body mass in sweat (EX), or the same duration of room-temperature seated rest (REST). Further blood samples were collected post-EX/REST (1.5-3 h post-baseline). During EX, sweat was collected from 5 sites and water was consumed to fully replace sweat losses. During REST, participants drank 100 mL/h. Food intake was individually standardised over the 24 h, with bottled water available ad libitum. Participants collected all urine produced over the 24 h and returned the following morning to repeat baseline measures fasted (24-h). Sodium balance was estimated over the 24 h using sweat/urine losses and dietary intake. Data were analysed using 2-way ANOVA, with Shapiro-Wilk tests to check normality followed by paired t-tests/Wilcoxon signed-rank tests as appropriate. Data are mean (standard deviation).
Dietary sodium intake was 2.3 (0.3) g and participants lost 2.8 (0.3)% body mass in sweat (containing 2.5 (0.9) g sodium). Sodium balance was lower for EX (-2.0 (1.6) g vs -1.0 (1.6) g; P = 0.022), despite lower 24-h urine sodium losses in EX (1.8 (1.2) g vs 3.3 (1.7) g; P = 0.001). Post-EX/REST blood sodium concentration was lower in EX (137.6 (2.3) mmol/L vs 139.9 (1.0) mmol/L; P = 0.002) but did not differ at 0 h (P = 0.906) or 24 h (P = 0.118). There was no difference in plasma volume change (P = 0.423), urine specific gravity (P = 0.495), or systolic (P = 0.324) or diastolic (P = 0.274) blood pressure between trials over the 24 h. Body mass change over 24 h was not different between trials (REST +0.25 (1.10)%; EX +0.40 (0.68)%; P = 0.663).
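The balance arithmetic implied by these group means can be written out explicitly; sweat sodium loss in REST is assumed negligible, which is consistent with the reported values.

```latex
% 24-h sodium balance = intake - (sweat + urine losses), in g of sodium
% EX:   2.3 - (2.5 + 1.8) = -2.0 g
% REST: 2.3 - (0   + 3.3) = -1.0 g   (sweat sodium loss at rest taken as ~0)
\mathrm{balance} = \mathrm{Na}_{\mathrm{intake}} - \left( \mathrm{Na}_{\mathrm{sweat}} + \mathrm{Na}_{\mathrm{urine}} \right)
```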
Sweat loss through low-intensity exercise resulted in a lower sodium balance compared with rest. Although urine sodium output was reduced with EX, the reduction was not sufficient to offset exercise-induced sodium losses. Despite this, body mass, plasma volume and blood sodium concentration were not different between trials, suggesting sodium may have been lost from non-osmotic sodium stores. This suggests sweat sodium losses could be used to reduce sodium balance, although longer studies are required to confirm this hypothesis.
An experiment was conducted in 2022 and 2023 near Rocky Mount and Clayton, NC, to evaluate residual herbicide-coated fertilizer for cotton tolerance and Palmer amaranth control. Treatments included acetochlor, atrazine, dimethenamid-P, diuron, flumioxazin, fluometuron, fluridone, fomesafen, linuron, metribuzin, pendimethalin, pyroxasulfone, pyroxasulfone + carfentrazone, S-metolachlor, and sulfentrazone. Each herbicide was individually coated on granular ammonium sulfate (AMS) and top-dressed at 321 kg ha−1 (67 kg N ha−1) onto 5- to 7-leaf cotton. The check plots received the equivalent rate of nonherbicide-treated AMS. Before top-dress, all plots (including the check) were treated with glyphosate and glufosinate to control previously emerged weeds. All herbicides except metribuzin resulted in only transient cotton injury. Cotton response to metribuzin varied by year and location. In 2022, metribuzin caused 11% to 39% and 8% to 17% injury at the Clayton and Rocky Mount locations, respectively. In 2023, metribuzin caused 13% to 32% injury at Clayton and 73% to 84% injury at Rocky Mount. Pyroxasulfone (91%), pyroxasulfone + carfentrazone (89%), fomesafen (87%), fluridone (86%), flumioxazin (86%), and atrazine (85%) controlled Palmer amaranth ≥85%. Pendimethalin and fluometuron were the least effective treatments, resulting in 58% and 62% control, respectively. As anticipated, early season metribuzin injury translated into yield loss; plots treated with metribuzin yielded 640 kg ha−1, comparable to yields following linuron (790 kg ha−1). These findings suggest that, with the exception of metribuzin, residual herbicides coated onto AMS may be suitable and effective in cotton production, providing growers with additional modes of action for late-season control of multiple herbicide–resistant Palmer amaranth.
In response to the COVID-19 pandemic, we rapidly implemented a plasma coordination center within two months to support transfusion for two outpatient randomized controlled trials. The center design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipments of blood group-compatible plasma for transfusions into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain restraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
The Centers for Disease Control and Prevention (CDC)-funded Cancer Prevention and Control Research Network (CPCRN) has been a leader in cancer-related dissemination & implementation (D&I) science. Given increased demand for D&I research, the CPCRN Scholars Program launched in 2021 to expand the number of practitioners, researchers, and trainees proficient in cancer D&I science methods.
Methods:
The evaluation was informed by a logic model and data collected through electronic surveys. Through an application process (baseline survey), we assessed scholars’ competencies in D&I science domains/subdomains, collected demographic data, and asked scholars to share proposed project ideas. We distributed an exit survey one month after program completion to assess scholars’ experience and engagement with the program and changes in D&I competencies. A follow-up survey was administered to alumni nine months post-program to measure their continued network engagement, accomplishments, and skills.
Results:
Three cohorts completed the program, consisting of 20, 17, and 25 scholars in Years 1-3, respectively. There was a significant increase in total D&I competency scores for all three cohorts across the 4 overarching domains and 43 subdomains (M_pre = 1.38, M_post = 1.89). Differences were greatest for the domains of Practice-Based Considerations (0.50 mean difference) and Theory & Analysis (0.47 mean difference). Alumni surveys revealed that scholars appreciated access to D&I-focused webinars, toolkits, and training resources. Eighty percent remain engaged with CPCRN workgroups and investigators.
Conclusions:
Program evaluation with scholars and alumni helped with ongoing quality assurance, introspection, and iterative program adaptation to meet scholars’ needs. This approach is recommended for large-scale capacity-building training programs.
The 2014 US National Strategy for Combating Antibiotic-Resistant Bacteria (CARB) aimed to reduce inappropriate inpatient antibiotic use by 20% for monitored conditions, such as community-acquired pneumonia (CAP), by 2020. We evaluated annual trends in length of therapy (LOT) in adults hospitalized with uncomplicated CAP from 2013 through 2020.
Methods:
We conducted a retrospective cohort study among adults with a primary diagnosis of bacterial or unspecified pneumonia using International Classification of Diseases Ninth and Tenth Revision codes in MarketScan and the Centers for Medicare & Medicaid Services databases. We included patients with length of stay (LOS) of 2–10 days, discharged home with self-care, and not rehospitalized in the 3 days following discharge. We estimated inpatient LOT based on LOS from the PINC AI Healthcare Database. The total LOT was calculated by summing estimated inpatient LOT and actual postdischarge LOT. We examined trends from 2013 to 2020 in patients with total LOT >7 days, which was considered an indicator of likely excessive LOT.
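The total-LOT construction described above reduces to a simple sum plus a threshold flag; here is a minimal sketch, with field names invented for illustration.

```python
# Total length of therapy (LOT) = estimated inpatient LOT + observed
# post-discharge LOT; flagged as likely excessive when > 7 days.
def total_lot(inpatient_lot_days: int, postdischarge_lot_days: int) -> int:
    return inpatient_lot_days + postdischarge_lot_days

def likely_excessive(lot_days: int, threshold: int = 7) -> bool:
    return lot_days > threshold

# e.g. 4 estimated inpatient days + 5 post-discharge days = 9 days > 7
print(likely_excessive(total_lot(4, 5)))  # True
```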
Results:
There were 44,976 and 400,928 uncomplicated CAP hospitalizations among patients aged 18–64 years and ≥65 years, respectively. From 2013 to 2020, the proportion of patients with total LOT >7 days decreased by 25% (from 68% to 51%) among patients aged 18–64 years and by 27% (from 68% to 50%) among patients aged ≥65 years.
Conclusions:
Although likely excessive LOT for uncomplicated CAP patients decreased since 2013, the proportion of patients treated with LOT >7 days still exceeded 50% in 2020. Antibiotic stewardship programs should continue to pursue interventions to reduce likely excessive LOT for common infections.
We investigated the symptoms of SARS-CoV-2 infection, their dynamics, and their discriminatory power for the disease using longitudinally, prospectively collected information reported at the time of occurrence. We analysed data from a large phase 3 clinical UK COVID-19 vaccine trial, in which the alpha variant was the predominant strain. Participants were assessed for SARS-CoV-2 infection via nasal/throat PCR at recruitment, at vaccination appointments, and when symptomatic. Statistical techniques were implemented to infer estimates representative of the UK population, accounting for multiple symptomatic episodes associated with one individual. An optimal diagnostic model for SARS-CoV-2 infection was derived. The 4-month prevalence of SARS-CoV-2 was 2.1%, increasing to 19.4% (16.0%–22.7%) in participants reporting loss of appetite and 31.9% (27.1%–36.8%) in those with anosmia/ageusia. The model identified anosmia and/or ageusia, fever, congestion, and cough as significantly associated with SARS-CoV-2 infection. Symptom dynamics were vastly different in the two groups: in PCR-positive participants, symptoms started slowly, peaked later, and lasted longer, whilst in PCR-negative participants they declined consistently, with, on average, fewer than 3 days of symptoms reported. Anosmia/ageusia peaked late in confirmed SARS-CoV-2 infection (day 12), indicating low discriminatory power for early disease diagnosis.
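As a hedged illustration of the kind of symptom-based diagnostic model described, the sketch below fits a logistic regression on binary symptom indicators (anosmia/ageusia, fever, congestion, cough) against PCR status. The data are synthetic placeholders, not the trial's; the authors' actual model specification is not reproduced here.

```python
# Logistic model of PCR-confirmed infection from binary symptom flags.
# Synthetic data only: coefficients below are arbitrary for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(500, 4))  # anosmia/ageusia, fever, congestion, cough
true_logit = -3.0 + X @ np.array([2.0, 0.8, 0.5, 0.6])
y = rng.random(500) < 1 / (1 + np.exp(-true_logit))  # simulated PCR+ labels

model = LogisticRegression().fit(X, y)
print("fitted log-odds per symptom:", model.coef_.round(2))
```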
To compare how clinical researchers generate data-driven hypotheses with VIADS (a visual interactive analysis tool for filtering and summarizing large datasets coded with hierarchical terminologies) or other tools.
Methods:
We recruited clinical researchers and separated them into “experienced” and “inexperienced” groups. Within each group, participants were randomly assigned to a VIADS or control arm. Each participant conducted a remote 2-hour study session for hypothesis generation with the same study facilitator on the same datasets, following a think-aloud protocol. Screen activities and audio were recorded, transcribed, coded, and analyzed. Hypotheses were evaluated by seven experts on their validity, significance, and feasibility. We conducted multilevel random-effect modeling for statistical tests.
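As an illustration of the multilevel random-effect modeling mentioned above, the sketch below fits a mixed model with hypothesis ratings nested within participants using statsmodels; the column names and values are assumptions for illustration, not the study's data.

```python
# Mixed-effects model: fixed effect of group (VIADS vs control),
# random intercept per participant. Data are invented placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "rating":      [3.2, 2.8, 3.5, 4.0, 3.9, 4.1, 2.5, 3.0, 3.3, 3.7, 2.9, 3.4],
    "group":       ["VIADS"] * 6 + ["control"] * 6,
    "participant": ["p1", "p1", "p2", "p2", "p3", "p3",
                    "p4", "p4", "p5", "p5", "p6", "p6"],
})

model = smf.mixedlm("rating ~ group", df, groups=df["participant"]).fit()
print(model.summary())  # fixed-effect estimate for group + variance components
```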
Results:
Eighteen participants generated 227 hypotheses, of which 147 (65%) were valid. The VIADS and control groups generated a similar number of hypotheses. The VIADS group took a significantly shorter time to generate one hypothesis (e.g., among inexperienced clinical researchers, 258 s versus 379 s, p = 0.046, power = 0.437, ICC = 0.15). The VIADS group received significantly lower ratings than the control group on feasibility and the combination rating of validity, significance, and feasibility.
Conclusion:
The role of VIADS in hypothesis generation seems inconclusive. The VIADS group took a significantly shorter time to generate each hypothesis. However, the combined validity, significance, and feasibility ratings of their hypotheses were significantly lower. Further characterization of hypotheses, including specifics on how they might be improved, could guide future tool development.
We present and evaluate the prospects for detecting coherent radio counterparts to gravitational wave (GW) events using Murchison Widefield Array (MWA) triggered observations. The MWA rapid-response system, combined with its buffering mode (∼4 min negative latency), enables us to catch any radio signals produced from seconds prior to hours after a binary neutron star (BNS) merger. The large field of view of the MWA (∼1,000 deg² at 120 MHz) and its location under the high-sensitivity sky region of the LIGO-Virgo-KAGRA (LVK) detector network forecast a high chance of being on-target for a GW event. We consider three observing configurations for the MWA to follow up GW BNS merger events: a single dipole per tile, the full array, and four sub-arrays. We then perform a population synthesis of BNS systems to predict the radio-detectable fraction of GW events using these configurations. We find that the configuration with four sub-arrays is the best compromise between sky coverage and sensitivity, as it is capable of placing meaningful constraints on the radio emission from 12.6% of GW BNS detections. Based on the timescales of four BNS merger coherent radio emission models, we propose an observing strategy that involves triggering the buffering mode to target coherent signals emitted prior to, during, or shortly following the merger, followed by continued recording for up to three hours to target later post-merger emission. We expect the MWA to trigger on ∼5–22 BNS merger events during the LVK O4 observing run, which could potentially result in two detections of predicted coherent emission.