Traditional foods are increasingly being incorporated into modern diets, largely driven by consumers seeking alternative food sources with superior nutritional and functional properties. Within Australia, Aboriginal and Torres Strait Islander peoples are looking to develop their traditional foods for commercial markets. However, supporting evidence that these foods are safe for consumption by the wider general population is limited. At the 2022 NSA conference, a keynote presentation titled ‘Decolonising food regulatory frameworks to facilitate First Peoples food sovereignty’ was delivered, followed by a manuscript titled ‘Decolonising food regulatory frameworks: Importance of recognising traditional culture when assessing dietary safety of traditional foods’, published in the conference proceedings journal(1). These pieces examined the current regulatory frameworks used to assess traditional foods and proposed a way forward that would allow Traditional Custodians to successfully develop their foods for modern markets. Building upon those works, this presentation showcases best-practice Indigenous engagement and collaboration principles in the development of traditionally used food products. To achieve this, we collaborated with a collective of Gamilaraay peoples who are looking to reignite their traditional grain practices and develop grain-based food products. To meet current food safety regulatory requirements, we needed to understand how this grain would fit into modern diets: documenting its history of use, elucidating the nutritional and functional properties attributable to the grain, and developing a safety dossier(2) so that the Traditional Custodians can confidently take their product to market. To aid the Traditional Custodians in performing their due diligence, we systematically analysed the dietary safety of the selected native grain, comparing it side-by-side with commonly consumed wheat in a range of in vitro bioassays and chemical analyses. From a food safety perspective, the native grain is equivalent to commonly consumed wheat: it was no more toxic than wheat in our biological screening systems, chemical analysis showed that contaminant levels were below tolerable limits, and we did not identify any chemical classes of concern. Our initial findings support the history of safe use and suggest that the tested native grain species is no less safe than commonly consumed wheat. This risk assessment, together with a previously published nutritional study(3), provides an overall indication that the grain is nutritionally superior and viable for commercial development. The learnings from this project can direct future risk assessment of traditional foods and thereby facilitate safe market access for a broader range of traditionally used foods. Importantly, the methods presented are culturally safe and financially viable for the small businesses hoping to enter the market.
Low vitamin D associated with high parathyroid hormone (PTH) is common in HIV infection. We determined the association between total 25(OH)D and PTH in adolescents living with HIV in Zambia and Zimbabwe. Adolescents (11–19 years) perinatally infected with HIV and established on antiretroviral therapy for ≥ 6 months were recruited into a cross-sectional study. Socio-demographic and clinical characteristics were recorded, anthropometry was measured, and fasted serum concentrations of 1,25(OH)2D, total 25(OH)D and intact PTH were measured. The association between total 25(OH)D and PTH was examined using natural cubic spline regression. 842 participants (female: 53·2%) with a median age of 15·5 (IQR: 13·2–17·9) years were enrolled. Median antiretroviral therapy duration was 9·8 (IQR: 6·3–12·3) years, and 165/841 had an HIV viral load >60 copies/ml. Stunting (height-for-age z-score <–2) and underweight (weight-for-age z-score <–2) were observed in 29·9 and 30·0%, respectively. Three-quarters reported daily Ca intakes <150 mg/d. The mean (sd) concentrations of total 25(OH)D and 1,25(OH)2D were 66·1 (16·5) nmol/l and 210·6 (70·4) pmol/l, respectively, and the median PTH level was 4·3 (IQR: 3·3–5·5) pmol/l. There was an inverse non-linear relationship between total 25(OH)D and PTH, with PTH levelling off at a 25(OH)D concentration of 74·6 nmol/l (95% CI: 74·5, 75·2). Results were consistent in those taking tenofovir disoproxil fumarate and in virally unsuppressed participants. In this population with extremely low habitual Ca intakes, the lack of association between 25(OH)D and PTH when 25(OH)D exceeded 75 nmol/l suggests that concentrations of 25(OH)D >75 nmol/l may need to be achieved to improve bone health; this warrants investigation in future studies.
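To make the modelling step concrete, here is a minimal sketch of a natural cubic spline regression of PTH on 25(OH)D in Python (statsmodels with a patsy cr() basis). The variable names, simulated data, and df=4 are illustrative assumptions, not the study's dataset or code.

```python
# Minimal sketch: natural cubic spline regression (simulated data for illustration).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
dat = pd.DataFrame({"vitd_25ohd": rng.uniform(30, 110, 500)})  # nmol/l
# Simulate an inverse, plateauing relationship, roughly like the one reported.
dat["pth"] = (8 - 3 * np.log(dat["vitd_25ohd"] / 30)).clip(lower=2) \
    + rng.normal(0, 0.8, 500)

# cr() builds a natural cubic regression spline basis inside the formula.
model = smf.ols("pth ~ cr(vitd_25ohd, df=4)", data=dat).fit()
print(model.summary())
```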
Partial remission after major depressive disorder (MDD) is common and a robust predictor of relapse. However, it remains unclear to what extent preventive psychological interventions reduce depressive symptomatology and relapse risk after partial remission. We aimed to identify variables predicting relapse and to determine whether, and for whom, psychological interventions are effective in preventing relapse, reducing (residual) depressive symptoms, and increasing quality of life among individuals in partial remission. This preregistered (CRD42023463468) systematic review and individual participant data meta-analysis (IPD-MA) pooled data from 16 randomized controlled trials (n = 705 partial remitters) comparing psychological interventions to control conditions, using 1- and 2-stage IPD-MA. Among partial remitters, baseline clinician-rated depressive symptoms (p = .005) and prior episodes (p = .012) predicted relapse. Psychological interventions were associated with reduced relapse risk over 12 months (hazard ratio [HR] = 0.60, 95% confidence interval [CI] 0.43–0.84) and significantly lowered posttreatment depressive symptoms (Hedges’ g = 0.29, 95% CI 0.04–0.54), with sustained effects at 60 weeks (Hedges’ g = 0.33, 95% CI 0.06–0.59), compared to nonpsychological interventions. However, interventions did not significantly improve quality of life at 60 weeks (Hedges’ g = 0.26, 95% CI -0.06 to 0.58). No moderators of relapse prevention efficacy were found. Men, older individuals, and those with higher baseline symptom severity experienced greater reductions in symptomatology at 60 weeks. Psychological interventions for individuals with partially remitted depression reduce relapse risk and residual symptomatology, with efficacy generalizing across patient characteristics and treatment types. This suggests that psychological interventions are a recommended treatment option for this patient population.
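For readers unfamiliar with the effect size reported above, the following is a minimal Python sketch of Hedges’ g (the bias-corrected standardized mean difference) with an approximate 95% CI. It is a generic two-group illustration, not the IPD-MA code, which pooled effects across trials.

```python
# Minimal sketch: Hedges' g with an approximate 95% CI (generic two-group case).
import numpy as np

def hedges_g(x, y):
    """Bias-corrected standardized mean difference between groups x and y."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    nx, ny = len(x), len(y)
    s_pooled = np.sqrt(((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1))
                       / (nx + ny - 2))
    d = (x.mean() - y.mean()) / s_pooled      # Cohen's d
    j = 1 - 3 / (4 * (nx + ny) - 9)           # small-sample correction factor
    g = j * d
    se = np.sqrt((nx + ny) / (nx * ny) + g**2 / (2 * (nx + ny)))
    return g, (g - 1.96 * se, g + 1.96 * se)
```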
Novel management strategies for controlling smutgrass have potential to influence sward dynamics in bahiagrass forage systems. This experiment evaluated population shifts in bahiagrass forage following implementation of integrated herbicide and fertilizer management plans for controlling smutgrass. Herbicide treatments included indaziflam applied PRE, hexazinone applied POST, a combination of PRE + POST herbicides, and a nonsprayed control. Fertilizer treatments included nitrogen, nitrogen + potassium, and an unfertilized control. The POST treatment reduced smutgrass coverage regardless of PRE or fertilizer application by the end of the first season, and coverage remained low for the 3-yr duration of the experiment (P < 0.01). All treatments, including nontreated controls, reduced smutgrass coverage during year 3 (P < 0.05), indicating that routine harvesting to remove biomass reduced smutgrass coverage. Bahiagrass cover increased at the end of year 1 with the POST treatment (P < 0.01), but only the POST + fertilizer treatment maintained greater bahiagrass coverage than the nontreated control by the end of year 3 (P < 0.05). Expenses associated with the POST + fertilizer treatment totaled US$348 ha−1 across the 3-yr experiment. Other smutgrass control options include complete removal of biomass (hay production) and pasture renovation, which can cost three-fold or more than the POST + fertilizer treatment. Complete removal of biomass may reduce smutgrass coverage by removing mature seedheads, but at a much greater expense of US$2,835 to US$5,825 ha−1, depending on herbicide and fertilizer inputs. Bahiagrass renovation is US$826 ha−1 in establishment costs alone; when pasture production expenses are included for two seasons postrenovation, the total increases to US$1,120 ha−1 across three seasons. The importance of hexazinone and fertilizer as components of smutgrass control in bahiagrass forage was confirmed in this study. Future research should focus on the biology of smutgrass and the role of a PRE treatment in a long-term, larger-scale forage system.
Scholarly and practitioner interest in authentic leadership has grown at an accelerating rate over the last decade, resulting in a proliferation of publications across diverse social science disciplines. Accompanying this interest has been criticism of authentic leadership theory and the methods used to explore it. We conducted a systematic review of 303 scholarly articles published from 2010 to 2023 to critically assess the conceptual and empirical strengths and limitations of this literature and map the nomological network of the authentic leadership construct. Results indicate that much of the extant research does not follow best practices in research design and analysis. Based on these findings, we propose an agenda for advancing authentic leadership theory and research that embraces a signaling theory perspective.
In response to the COVID-19 pandemic, we rapidly implemented a plasma coordination center within two months to support transfusion for two outpatient randomized controlled trials. The center design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipments of blood group-compatible plasma for transfusions into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain constraints at the sites.
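To illustrate the blood group-compatibility constraint the digital inventory had to enforce, here is a minimal sketch of textbook ABO plasma-compatibility logic in Python; it is a generic illustration, not the coordination center's actual software, and the unit records are hypothetical.

```python
# Minimal sketch: ABO plasma compatibility (textbook rules, hypothetical records).
# Donor plasma must lack antibodies against the recipient's red-cell antigens,
# so AB plasma suits any recipient while O plasma suits only O recipients.
PLASMA_RECIPIENTS = {
    "O": {"O"},
    "A": {"A", "O"},
    "B": {"B", "O"},
    "AB": {"AB", "A", "B", "O"},
}

def compatible_units(inventory, recipient_abo):
    """Return inventory units whose donor ABO type can serve this recipient."""
    return [u for u in inventory
            if recipient_abo in PLASMA_RECIPIENTS[u["donor_abo"]]]

units = [{"id": "P001", "donor_abo": "AB"}, {"id": "P002", "donor_abo": "O"}]
print(compatible_units(units, "A"))  # only the AB unit is compatible
```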
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
Palmer amaranth (Amaranthus palmeri S. Watson, AMAPA) is one of the most troublesome weeds in North America due to its rapid growth rate, substantial seed production, competitiveness, and propensity to evolve herbicide-resistant populations. Though frequently encountered in the South, Midwest, and Mid-Atlantic regions of the United States, A. palmeri was recently identified in soybean [Glycine max (L.) Merr.] fields in Genesee, Orange, and Steuben counties, NY, where glyphosate was the primary herbicide for in-crop weed control. This research, conducted in 2023, aimed to (1) describe the dose response of three putative resistant NY A. palmeri populations to glyphosate, (2) determine their mechanisms of resistance, and (3) assess their sensitivity to other postemergence herbicides commonly used in NY crop production systems. Based on the effective dose necessary to reduce aboveground biomass by 50% (ED50), the NY populations were 42 to 67 times more resistant to glyphosate compared with a glyphosate-susceptible population. Additionally, the NY populations had elevated EPSPS gene copy numbers ranging from 25 to 135 located within extrachromosomal circular DNA (eccDNA). Label rate applications of Weed Science Society of America (WSSA) Group 2 herbicides killed up to 42% of the NY populations of A. palmeri. Some variability was observed among populations in response to WSSA Group 5 and 27 herbicides. All populations were effectively controlled by labeled rates of herbicides belonging to WSSA Groups 4, 10, 14, and 22. Additional research is warranted to confirm whether NY populations have evolved multiple resistance to herbicides within other WSSA groups and to develop effective A. palmeri management strategies suitable for NY crop production.
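For context on the ED50 estimates above, here is a minimal sketch of fitting a three-parameter log-logistic dose-response curve with SciPy. The dose and biomass values, parameter names, and starting guesses are illustrative assumptions, not the study's dataset; such analyses are commonly run with R's drc package rather than this code.

```python
# Minimal sketch: 3-parameter log-logistic dose-response fit (made-up data).
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, upper, slope, ed50):
    """Aboveground biomass (% of nontreated control) declining with dose."""
    return upper / (1.0 + (dose / ed50) ** slope)

dose = np.array([0.125, 0.25, 0.5, 1, 2, 4, 8])   # multiples of the label rate
biomass = np.array([98, 95, 88, 70, 45, 20, 8])   # % of nontreated control

params, _ = curve_fit(log_logistic, dose, biomass, p0=[100.0, 1.0, 1.0])
print(f"ED50 = {params[2]:.2f} (in units of the label rate)")
```

Resistance ratios like the 42- to 67-fold values above are then the ED50 of a resistant population divided by that of the susceptible population.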
From early on, infants show a preference for infant-directed speech (IDS) over adult-directed speech (ADS), and exposure to IDS has been correlated with language outcome measures such as vocabulary. The present multi-laboratory study explores this issue by investigating whether there is a link between early preference for IDS and later vocabulary size. Infants’ preference for IDS was tested as part of the ManyBabies 1 project, and follow-up vocabulary data (MacArthur-Bates Communicative Development Inventories; CDI) were collected from a subsample of this dataset at 18 and 24 months. A total of 341 (18 months) and 327 (24 months) infants were tested across 21 laboratories. Neither in the preregistered analyses with North American and UK English samples, nor in exploratory analyses with a larger sample, did we find evidence for a relation between IDS preference and later vocabulary. We discuss implications of this finding in light of recent work suggesting that IDS preference measured in the laboratory has low test-retest reliability.
Many preoperative urine cultures are of low value and may even lead to patient harms. This study sought to understand practices around ordering preoperative urine cultures and prescribing antibiotic treatment.
Methods:
We interviewed participants using a qualitative semi-structured interview guide. Collected data were coded inductively and with the Dual Process Model (DPM) using MAXQDA software. Data in the “Testing Decision-Making” code were further reviewed using perceived risk as a sensitizing concept.
Results:
We identified themes relating to surgeons’ concerns about de-implementing preoperative urine cultures to detect asymptomatic bacteriuria (ASB) in patients undergoing non-urological procedures: (1) anxiety and uncertainty about missing signs of infection spanned surgical specialties; (2) there were perceived risks of negative consequences associated with omitting urine cultures and treatment for specific procedure sites and types; and (3) participants suggested potential routes for adjusting these perceived risks to facilitate acceptance of de-implementation. Notably, participants suggested that leadership support and peer engagement could help improve surgeon buy-in.
Conclusions:
Concerns about perceived risks sometimes outweigh the evidence against routine preoperative urine cultures to detect ASB. Evidence from trusted peers may improve openness to de-implementing preoperative urine cultures.
Depression is an independent risk factor for cardiovascular disease (CVD), but it is unknown if successful depression treatment reduces CVD risk.
Methods
Using eIMPACT trial data, we examined the effect of modernized collaborative care for depression on indicators of CVD risk. A total of 216 primary care patients with depression and elevated CVD risk were randomized to 12 months of the eIMPACT intervention (internet cognitive-behavioral therapy [CBT], telephonic CBT, and select antidepressant medications) or usual primary care. CVD-relevant health behaviors (self-reported CVD prevention medication adherence, sedentary behavior, and sleep quality) and traditional CVD risk factors (blood pressure and lipid fractions) were assessed over 12 months. Incident CVD events were tracked over four years using a statewide health information exchange.
Results
The intervention group exhibited greater improvement in depressive symptoms (p < 0.01) and sleep quality (p < 0.01) than the usual care group, but there was no intervention effect on systolic blood pressure (p = 0.36), low-density lipoprotein cholesterol (p = 0.38), high-density lipoprotein cholesterol (p = 0.79), triglycerides (p = 0.76), CVD prevention medication adherence (p = 0.64), or sedentary behavior (p = 0.57). There was an intervention effect on diastolic blood pressure that favored the usual care group (p = 0.02). The likelihood of an incident CVD event did not differ between the intervention (13/107, 12.1%) and usual care (9/109, 8.3%) groups (p = 0.39).
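As an aside on the incident-event comparison, the abstract does not state which test produced p = 0.39; the following is a minimal sketch assuming a Fisher's exact test on the reported counts.

```python
# Minimal sketch: two-group comparison of incident CVD events (test assumed).
from scipy.stats import fisher_exact

# events vs. non-events per arm, from the reported 13/107 and 9/109.
table = [[13, 107 - 13], [9, 109 - 9]]
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.2f}")
```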
Conclusions
Successful depression treatment alone is not sufficient to lower the heightened CVD risk of people with depression. Alternative approaches are needed.
The crystallite sizes in the particles from four fractions of a kaolinite clay were determined from the broadening of the X-ray diffraction lines. Measurements were made on the (002) and (111) reflections, whose crystallographic directions correspond to the clay plate thickness and diagonal, respectively. The extent of crystal imperfection was determined by comparing the calculated crystallite size with the mean size based on measurements from electron micrographs. The crystal imperfections were found to be more extensive in the plate diagonal, [111], than in the plate face, [002], direction. Electron micrographs of hydrofluoric acid-etched samples revealed plate-edge and plate-face imperfections; the latter show a regularity suggesting a mosaic-like texture in the plate surface. Surface imperfections probably have a significant influence on the dispersion and flocculation behavior of kaolinite.
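Crystallite size from X-ray line broadening is classically obtained with the Scherrer equation, D = Kλ/(β cos θ); a minimal sketch follows. The shape factor K = 0.9, Cu Kα wavelength, and example peak values are illustrative assumptions, not the paper's measurements.

```python
# Minimal sketch: Scherrer crystallite-size estimate (illustrative values).
import numpy as np

def scherrer_size_nm(fwhm_deg, two_theta_deg, wavelength_nm=0.15406, k=0.9):
    """D = K * lambda / (beta * cos(theta)), with beta in radians.

    fwhm_deg should already be corrected for instrumental broadening.
    """
    beta = np.radians(fwhm_deg)
    theta = np.radians(two_theta_deg / 2.0)
    return k * wavelength_nm / (beta * np.cos(theta))

# e.g. a kaolinite (002) reflection near 2-theta = 24.9 deg with Cu K-alpha:
print(f"D = {scherrer_size_nm(0.4, 24.9):.1f} nm")
```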
Wastewater-based epidemiology (WBE) has proven to be a powerful tool for the population-level monitoring of pathogens, particularly severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). Several wastewater sampling regimes and methods of viral concentration have been investigated, mainly targeting SARS-CoV-2. However, the use of passive samplers in near-source environments for a range of viruses in wastewater remains under-investigated. To address this, near-source passive samples were taken at four locations targeting student halls of residence, chosen as an exemplar due to their high population density and perceived risk of disease transmission. The viruses investigated were SARS-CoV-2 and its variants of concern (VOCs), influenza viruses, and enteroviruses. Sampling was conducted either in the morning, with passive samplers in place overnight (17 h), or during the day, with an exposure of 7 h. We demonstrated the usefulness of near-source passive sampling for the detection of VOCs using quantitative polymerase chain reaction (qPCR) and next-generation sequencing (NGS). Furthermore, several outbreaks of influenza A and sporadic outbreaks of enteroviruses (some associated with enterovirus D68 and coxsackieviruses) were identified among the resident student population, providing evidence of the usefulness of near-source, in-sewer sampling for monitoring the health of high-population-density communities.
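As background to the qPCR step, here is a minimal sketch of converting quantification cycle (Cq) values to copy numbers via a standard curve. The slope, intercept, and Cq value are hypothetical; the study's assay parameters are not given in the abstract.

```python
# Minimal sketch: Cq -> copies via a qPCR standard curve (hypothetical values).
# The curve Cq = slope * log10(copies) + intercept is fitted from serial dilutions.
slope, intercept = -3.32, 38.0   # a slope near -3.32 implies ~100% efficiency

def copies_from_cq(cq):
    return 10 ** ((cq - intercept) / slope)

efficiency = 10 ** (-1 / slope) - 1   # amplification efficiency from the slope
print(f"{copies_from_cq(30.0):.0f} copies, efficiency = {efficiency:.0%}")
```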
We present a demonstration version of a commensal pipeline for Fast Radio Burst (FRB) searches using a real-time incoherent beam from the Murchison Widefield Array (MWA). The main science targets of the pipeline are bright, nearby FRBs from the local Universe (including Galactic FRBs like the one from SGR 1935+2154), which are the best candidates for probing FRB progenitors and understanding the physical mechanisms powering these extremely energetic events. Recent FRB detections by LOFAR (down to 110 MHz) and the Green Bank Telescope (at 350 MHz), together with Canadian Hydrogen Intensity Mapping Experiment (CHIME) detections extending down to 400 MHz, prove that there is a population of FRBs that can be detected below 350 MHz. The new MWA beamformer, known as the ‘MWAX multibeam beamformer’, can form multiple incoherent and coherent beams (with different parameters) commensally with any ongoing MWA observations. One of the beams is currently used for FRB searches (tested at 10 kHz frequency resolution and time resolutions between 0.1 and 100 ms). A second beam (with 1 Hz frequency and 1 s time resolution) is used for the Search for Extraterrestrial Intelligence (SETI) project. This paper focuses on the FRB search pipeline and its verification on selected known bright pulsars. The pipeline uses the FREDDA implementation of the Fast Dispersion Measure Transform (FDMT) algorithm for single-pulse searches. Initially, it was tested during standard MWA observations, and more recently using dedicated observations of a sample of 11 bright pulsars. The pulsar PSR J0835-4510 (Vela) has been routinely used as the primary probe of data quality because its folded profile was always detected in the 200–230 MHz band with a typical signal-to-noise ratio $>$10, in agreement with expectations. Similarly, the low dispersion measure pulsar PSR B0950+08 was always detected in its folded profile in the 140–170 MHz band, and so far it has been the only object for which single pulses were detected. We present the estimated sensitivity of the search in the currently limited observing bandwidth of a single MWA coarse channel (1.28 MHz) and for the upgraded, future system with 12.8 MHz (10 channels) of bandwidth. Based on the expected sensitivity and existing FRB rate measurements, we project an FRB detection rate between a few and a few tens per year, with large uncertainty due to unknown FRB rates at low frequencies.
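Single-pulse searches such as FREDDA's FDMT hinge on the cold-plasma dispersion delay, which is what makes low-frequency searches so sensitive to the assumed dispersion measure; the following minimal sketch evaluates the standard delay formula at MWA frequencies (generic physics, not the pipeline's code).

```python
# Minimal sketch: dispersive delay across an observing band (standard formula).
def dispersion_delay_s(dm, f_lo_mhz, f_hi_mhz):
    """Arrival-time delay (s) of the band's low edge relative to its high edge.

    dm is the dispersion measure in pc cm^-3; delay ~ 4.149 ms * DM * f_GHz^-2.
    """
    k_ms = 4.149  # dispersion constant in ms GHz^2 (pc cm^-3)^-1
    return k_ms * dm * ((f_lo_mhz / 1e3) ** -2 - (f_hi_mhz / 1e3) ** -2) / 1e3

# Vela (DM ~ 68 pc cm^-3) across the 200-230 MHz band used above: ~1.7 s
print(f"{dispersion_delay_s(68.0, 200.0, 230.0):.2f} s")
```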
Interest in hydrotalcite-like compounds (HTLCs) has grown due to their role in controlling the mobility of aqueous metals in the environment, as well as their use as catalysts, catalyst precursors and specialty chemicals. Although these materials have been studied in a number of contexts, little is known of their thermodynamic properties. High-temperature oxide melt solution calorimetry was used to measure the standard enthalpy of formation of compounds M(II)1−xAlx(OH)2(CO3)x/2·mH2O (0.2 < x < 0.4; M(II) = Mg, Co, Ni and Zn). The enthalpy of formation of these compounds from the relevant single-cation phases was also determined. The formation of HTLCs results in a 5–20 kJ/mol enthalpy stabilization relative to the single-cation hydroxides, carbonates and water. The data are correlated with two variables: the ratio of divalent to trivalent cation in the solid (M(II)/Al) and the identity of the divalent cation. The M(II)/Al ratio was observed to exert a minor influence on the enthalpy of formation from single-cation phases, while greater differences in stabilization resulted from changes in the chemical nature of the divalent cation. However, the data do not support any statistically significant correlation between the composition of HTLCs and their heats of formation. Equilibrium geochemical calculations based upon the thermodynamic data illustrate the effect of HTLCs on the speciation of metals in natural waters. These calculations show that, in many cases, HTLCs form even in waters that are undersaturated with respect to the individual divalent metal hydroxides and carbonates. Phase diagrams and stability diagrams involving Ni-bearing HTLCs and the single-cation components are presented, including the Ni(II) concentration as a function of pH and the stability diagram for equilibrium among minerals in the CaO-NiO-Al2O3-SiO2-CO2-H2O system at 298 K.
White matter hyperintensity (WMH) burden is greater, has a frontal-temporal distribution, and is associated with proxies of exposure to repetitive head impacts (RHI) in former American football players. These findings suggest that in the context of RHI, WMH might have unique etiologies that extend beyond those of vascular risk factors and normal aging processes. The objective of this study was to evaluate the correlates of WMH in former elite American football players. We examined markers of amyloid, tau, neurodegeneration, inflammation, axonal injury, and vascular health and their relationships to WMH. A group of age-matched asymptomatic men without a history of RHI was included to determine the specificity of the relationships observed in the former football players.
Participants and Methods:
240 male participants aged 45-74 (60 unexposed asymptomatic men, 60 former college football players, 120 former professional football players) underwent semi-structured clinical interviews, magnetic resonance imaging (structural T1, T2 FLAIR, and diffusion tensor imaging), and lumbar puncture to collect cerebrospinal fluid (CSF) biomarkers as part of the DIAGNOSE CTE Research Project. Total WMH lesion volumes (TLV) were estimated using the Lesion Prediction Algorithm from the Lesion Segmentation Toolbox. Structural equation modeling, using Full-Information Maximum Likelihood (FIML) to account for missing values, examined the associations between log-TLV and the following variables: total cortical thickness, whole-brain average fractional anisotropy (FA), CSF amyloid β42, CSF p-tau181, CSF sTREM2 (a marker of microglial activation), CSF neurofilament light (NfL), and the modified Framingham stroke risk profile (rFSRP). Covariates included age, race, education, APOE ε4 carrier status, and evaluation site. Bootstrapped 95% confidence intervals assessed statistical significance. Models were performed separately for football players (college and professional players pooled; n=180) and the unexposed men (n=60). Due to differences in sample size, estimates were compared and considered different if the percent change between them exceeded 10%.
Results:
In the former football players (mean age=57.2, 34% Black, 29% APOE ε4 carriers), reduced cortical thickness (B=-0.25, 95% CI [-0.45, -0.08]), lower average FA (B=-0.27, 95% CI [-0.41, -0.12]), higher p-tau181 (B=0.17, 95% CI [0.02, 0.43]), and higher rFSRP score (B=0.27, 95% CI [0.08, 0.42]) were associated with greater log-TLV. Compared to the unexposed men, substantial differences in estimates were observed for rFSRP (Bcontrol=0.02, Bfootball=0.27, 994% difference), average FA (Bcontrol=-0.03, Bfootball=-0.27, 802% difference), and p-tau181 (Bcontrol=-0.31, Bfootball=0.17, -155% difference). In the former football players, rFSRP showed a stronger positive association and average FA a stronger negative association with WMH compared to unexposed men. The effect of WMH on cortical thickness was similar between the two groups (Bcontrol=-0.27, Bfootball=-0.25, 7% difference).
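A minimal sketch of the percent-change contrast used above, computed from the rounded estimates in this abstract; the published percentages were presumably derived from unrounded coefficients, so these reproduce them only approximately.

```python
# Minimal sketch: percent change between group estimates (rounded inputs).
def pct_change(b_control, b_football):
    return 100 * (b_football - b_control) / b_control

for name, bc, bf in [("rFSRP", 0.02, 0.27),
                     ("average FA", -0.03, -0.27),
                     ("p-tau181", -0.31, 0.17)]:
    print(f"{name}: {pct_change(bc, bf):.0f}%")
# rFSRP prints 1250% from the rounded 0.02, versus the published 994%.
```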
Conclusions:
These results suggest that the risk-factor and biological correlates of WMH differ between former American football players and asymptomatic individuals unexposed to RHI. In addition to vascular risk factors, white matter integrity on DTI showed a stronger relationship with WMH burden in the former football players. FLAIR WMH serves as a promising measure to further investigate the late multifactorial pathologies of RHI.
Repetitive transcranial magnetic stimulation (TMS) is an evidence-based treatment for adults with treatment-resistant depression (TRD). The standard clinical protocol for TMS is to stimulate the left dorsolateral prefrontal cortex (DLPFC). Although the DLPFC is a defining region of the brain's cognitive control network and is implicated in executive functions such as attention and working memory, we lack knowledge about whether TMS improves cognitive function independent of depression symptoms. This exploratory analysis sought to address this gap by assessing changes in attention before and after completion of a standard course of TMS in Veterans with TRD.
Participants and Methods:
Participants consisted of 7 Veterans (14.3% female; age M = 46.14, SD = 7.15; years of education M = 16.86, SD = 3.02) who completed a full 30-session course of TMS treatment and had significant depressive symptoms at baseline (Patient Health Questionnaire-9 [PHQ-9] score >5). Participants were given neurocognitive assessments measuring aspects of attention (Wechsler Adult Intelligence Scale 4th Edition [WAIS-IV] subtests: Digits Forward, Digits Backward, and Number Sequencing) at baseline and again after completion of TMS treatment. The relationship between pre- and post-treatment scores was examined using paired-samples t-tests for continuous variables and linear regression to covary for depression and posttraumatic stress disorder (PTSD), which is often comorbid with depression in Veteran populations.
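To make the analysis concrete, here is a minimal sketch of the paired-samples comparison using SciPy; the scores are hypothetical, and the covariate regression step for depression and PTSD severity is omitted.

```python
# Minimal sketch: paired t-test with a paired Cohen's d (hypothetical scores).
import numpy as np
from scipy.stats import ttest_rel

pre = np.array([8, 7, 9, 6, 8, 7, 9])      # e.g. Digits Forward at baseline
post = np.array([10, 8, 10, 8, 9, 8, 10])  # e.g. Digits Forward after TMS

t_stat, p_value = ttest_rel(pre, post)
diff = post - pre
d = diff.mean() / diff.std(ddof=1)          # standardized mean of the differences
print(f"t = {t_stat:.2f}, p = {p_value:.3f}, d = {d:.2f}")
```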
Results:
There was a significant improvement in Digit Span Forward (p=.01, d=-.53), but not in Digit Span Backward (p=.06) or Number Sequencing (p=.54), post-TMS treatment. Depression severity was not a significant predictor of performance on Digit Span Forward (F(1,5)=0.29, p=.61) after TMS treatment, nor was PTSD severity (F(1,5)=1.31, p=.32).
Conclusions:
Findings suggest that a standard course of TMS improves performance on less demanding measures of working memory, but possibly not on more demanding aspects of working memory. This improvement in cognitive function was independent of improvements in depression and PTSD symptoms. Further investigation in a larger sample and with direct neuroimaging measures of cognitive function is warranted.
To characterise transesophageal echocardiography practice patterns among paediatric cardiac surgical centres in the United States and Canada.
Methods:
A 42-question survey was sent to 80 echocardiography laboratory directors at paediatric cardiology centres with surgical programmes in the United States and Canada. Question domains included transesophageal echocardiography centre characteristics, performance and reporting, equipment use, trainee participation, and quality assurance.
Results:
Fifty of the 80 centres (62.5%) responded to the survey. Most settings were academic (86.0%), with 42.0% of centres performing > 350 surgical cases/year. The median number of transesophageal echocardiograms performed/cardiologist/year was 50 (26, 73). Pre-operative transesophageal echocardiography was performed in most surgical cases by 91.7% of centres. Transesophageal echocardiography was always performed by most centres following Norwood, Glenn, and Fontan procedures, and by < 10% of centres following coarctation repair. Many centres with a written guideline allowed transesophageal echocardiography transducer use at weights below manufacturer recommendations (50.0% and 61.1% for neonatal and paediatric transducers, respectively). Most centres (36/37, 97.3%) with categorical fellowships had rotations which included transesophageal echocardiography participation. Large surgical centres (>350 cases/year) had a higher median number of transesophageal echocardiograms/cardiologist/year (75.5 [53, 86] versus 35 [20, 52], p < 0.001) and more frequently used anaesthesia for diagnostic transesophageal echocardiography ≥ 67% of the time (100.0 versus 62.1%, p = 0.001).
Conclusions:
There is significant variability in transesophageal echocardiography practice patterns and training requirements among paediatric cardiology centres in the United States and Canada. Findings may help inform programmatic decisions regarding transesophageal echocardiography expectations, performance and reporting, equipment use, trainee involvement, and quality assurance.
We present and evaluate the prospects for detecting coherent radio counterparts to gravitational wave (GW) events using Murchison Widefield Array (MWA) triggered observations. The MWA rapid-response system, combined with its buffering mode ($\sim$4 min negative latency), enables us to catch any radio signals produced from seconds prior to hours after a binary neutron star (BNS) merger. The large field of view of the MWA ($\sim$$1\,000\,\textrm{deg}^2$ at 120 MHz) and its location under the high sensitivity sky region of the LIGO-Virgo-KAGRA (LVK) detector network, forecast a high chance of being on-target for a GW event. We consider three observing configurations for the MWA to follow up GW BNS merger events, including a single dipole per tile, the full array, and four sub-arrays. We then perform a population synthesis of BNS systems to predict the radio detectable fraction of GW events using these configurations. We find that the configuration with four sub-arrays is the best compromise between sky coverage and sensitivity as it is capable of placing meaningful constraints on the radio emission from 12.6% of GW BNS detections. Based on the timescales of four BNS merger coherent radio emission models, we propose an observing strategy that involves triggering the buffering mode to target coherent signals emitted prior to, during or shortly following the merger, which is then followed by continued recording for up to three hours to target later time post-merger emission. We expect MWA to trigger on $\sim$$5-22$ BNS merger events during the LVK O4 observing run, which could potentially result in two detections of predicted coherent emission.
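The projected yield follows directly from these numbers; a minimal sketch of the arithmetic, using only values stated above:

```python
# Minimal sketch: expected coherent-emission detections during O4 (abstract values).
constrained_fraction = 0.126          # fraction of GW BNS events MWA can constrain
triggers_low, triggers_high = 5, 22   # projected BNS merger triggers during O4

low = constrained_fraction * triggers_low
high = constrained_fraction * triggers_high
print(f"{low:.1f} to {high:.1f} expected detections")  # ~0.6 to ~2.8, i.e. up to ~2
```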