Posttraumatic stress disorder (PTSD) has been associated with advanced epigenetic age cross-sectionally, but the association between these variables over time is unclear. This study conducted meta-analyses to test whether new-onset PTSD diagnosis and changes in PTSD symptom severity were associated with changes in two metrics of epigenetic aging across two time points.
Methods
We conducted meta-analyses of the association between change in PTSD diagnosis and symptom severity and change in epigenetic age acceleration/deceleration (age-adjusted DNA methylation age residuals as per the Horvath and GrimAge metrics) using data from 7 military and civilian cohorts participating in the Psychiatric Genomics Consortium PTSD Epigenetics Workgroup (total N = 1,367).
Results
Meta-analysis revealed that the interaction between Time 1 (T1) Horvath age residuals and new-onset PTSD over time was significantly associated with Horvath age residuals at T2 (meta β = 0.16, meta p = 0.02, p-adj = 0.03). The interaction between T1 Horvath age residuals and changes in PTSD symptom severity over time was significantly related to Horvath age residuals at T2 (meta β = 0.24, meta p = 0.05). No associations were observed for GrimAge residuals.
Conclusions
Results indicated that individuals who developed new-onset PTSD or showed increased PTSD symptom severity over time evidenced greater epigenetic age acceleration at follow-up than would be expected based on baseline age acceleration. This suggests that PTSD may accelerate biological aging over time and highlights the need for intervention studies to determine if PTSD treatment has a beneficial effect on the aging methylome.
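As a rough, self-contained illustration of the analytic approach described in the Methods (a per-cohort regression of Time 2 age residuals on the T1 residual × PTSD-change interaction, followed by meta-analysis of the interaction coefficients), the sketch below uses simulated data. The variable names, the simulated cohorts, and the fixed-effect inverse-variance pooling are assumptions for illustration, not the workgroup's actual pipeline.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def cohort_beta(df):
    """Regress T2 age residuals on the T1-residual x PTSD-change
    interaction; return the interaction estimate and its SE."""
    fit = smf.ols("horvath_t2 ~ horvath_t1 * ptsd_change", data=df).fit()
    return fit.params["horvath_t1:ptsd_change"], fit.bse["horvath_t1:ptsd_change"]

def fixed_effect_meta(betas, ses):
    """Inverse-variance weighted (fixed-effect) pooled estimate."""
    w = 1.0 / np.asarray(ses) ** 2
    beta = np.sum(w * np.asarray(betas)) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return beta, se

# Simulated stand-in for the seven cohorts (hypothetical data).
rng = np.random.default_rng(0)
results = []
for _ in range(7):
    n = 195
    t1 = rng.normal(size=n)                # T1 age-adjusted residuals
    change = rng.binomial(1, 0.3, size=n)  # new-onset PTSD indicator
    t2 = 0.5 * t1 + 0.16 * t1 * change + rng.normal(size=n)
    df = pd.DataFrame({"horvath_t2": t2, "horvath_t1": t1, "ptsd_change": change})
    results.append(cohort_beta(df))

meta_beta, meta_se = fixed_effect_meta(*zip(*results))
print(f"meta beta = {meta_beta:.2f} (SE {meta_se:.2f})")
```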
Vascular rings represent a heterogeneous set of aberrant great-vessel anatomic configurations that can cause respiratory symptoms or dysphagia due to tracheal or oesophageal compression. These symptoms can be subtle and may present at varied ages. More recently, many vascular rings have been identified in asymptomatic patients, including on fetal echocardiogram, creating a conundrum for practitioners attempting to determine who will benefit from surgical correction. Here, we provide a review of vascular rings and a guide for the practitioner on when to consider additional imaging or referral. Additionally, we discuss the changing landscape regarding asymptomatic patients and fetal echocardiogram.
The opportunity to increase soybean yield has prompted Illinois farmers to plant soybean earlier than historical norms. Extending the growing season with an earlier planting date might alter the relationship between soybean growth and weed emergence timings, potentially changing the optimal herbicide application timings needed to minimize crop yield loss due to weed interference and to ensure minimal weed seed production. The objective of this research was to examine various herbicide treatments applied at different timings and rates to assess their effect on weed control and yield in early-planted soybean. Field experiments were conducted in 2021 at three locations across central Illinois to determine effective chemical strategies for weed management in early-planted soybean. PRE treatments consisted of an S-metolachlor + metribuzin premix applied at planting or just prior to soybean emergence at 0.5X (883 + 210 g ai ha−1) or 1X (1,766 + 420 g ai ha−1) label-recommended rates. POST treatments were applied when weeds reached 10 cm tall and consisted of 1X rates of glufosinate (655 g ai ha−1) + glyphosate (1,260 g ae ha−1) + ammonium sulfate, without or with pyroxasulfone at a 0.5X (63 g ai ha−1) or 1X (126 g ai ha−1) rate. Treatments that included a full-rate PRE followed by a POST resulted in the greatest and most consistent weed control at the final evaluation timing. The addition of pyroxasulfone to POST treatments did not consistently reduce late-season weed emergence; this lack of a consistent effect could be attributed to weed suppression from earlier soybean canopy closure. The full-rate PRE extended the timing of the POST application by 2 to 3 wk for all treatments at all locations except Urbana. Full-rate PRE treatments also reduced the time between the POST application and soybean canopy closure. Overall, a full-rate PRE reduced early-season weed interference and minimized the resulting soybean yield loss.
Foliar-applied postemergence applications of glufosinate are often made to glufosinate-resistant crops to provide nonselective weed control without significant crop injury. Rainfall, air temperature, solar radiation, and relative humidity near the time of application have been reported to affect glufosinate efficacy. However, previous research may not have captured the full range of weather variability to which glufosinate may be exposed before or following application. Additionally, climate models suggest that more extreme weather will become the norm, further expanding the weather range to which glufosinate can be exposed. The objective of this research was to quantify the probability of successful weed control (efficacy ≥85%) with glufosinate applied to key weed species across a broad range of weather conditions. A database of >10,000 North American herbicide evaluation trials was used in this study. The database was filtered to include treatments with a single postemergence application of glufosinate applied to waterhemp [Amaranthus tuberculatus (Moq.) Sauer], morningglory species (Ipomoea spp.), and/or giant foxtail (Setaria faberi Herrm.) <15 cm in height. These species were chosen because they are well represented in the database and are listed as common and troublesome weed species in both corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] (Van Wychen 2020, 2022). Individual random forest models were created for each species. Low rainfall (≤20 mm) over the 5 d before glufosinate application was detrimental to the probability of successful control of A. tuberculatus and S. faberi. Lower relative humidity (≤70%) and solar radiation (≤23 MJ m−2 d−1) on the day of application reduced the probability of successful weed control in most cases. Additionally, the probability of successful control decreased for all species when the average air temperature over the first 5 d after application was ≤25 C. As the climate continues to change and become more variable, the risk of unacceptable control of several common weed species with glufosinate is likely to increase.
A nonparametric test of dispersion for paired replicates data is described, which involves jackknifing logarithmic transformations of the ratio of variance estimates for the pre- and post-treatment populations. Results from a Monte Carlo simulation show that the test performs well under H0 and has good power properties. Examples are given of applying the procedure to psychiatric data.
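A minimal sketch of the kind of procedure described, assuming paired pre/post replicates: compute leave-one-pair-out jackknife pseudo-values of the log variance ratio and refer the resulting statistic to a t distribution. The function and toy data are illustrative, not the authors' code.

```python
import numpy as np
from scipy import stats

def jackknife_log_var_ratio(pre, post):
    """Jackknife test of dispersion for paired replicates:
    the statistic is log(var(post) / var(pre))."""
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    n = len(pre)
    full = np.log(post.var(ddof=1) / pre.var(ddof=1))
    # Leave-one-pair-out estimates and jackknife pseudo-values.
    loo = np.array([
        np.log(np.delete(post, i).var(ddof=1) / np.delete(pre, i).var(ddof=1))
        for i in range(n)
    ])
    pseudo = n * full - (n - 1) * loo
    est = pseudo.mean()
    se = pseudo.std(ddof=1) / np.sqrt(n)
    t = est / se
    p = 2 * stats.t.sf(abs(t), df=n - 1)  # two-sided p-value
    return est, t, p

rng = np.random.default_rng(1)
pre = rng.normal(50, 10, size=25)   # pre-treatment scores
post = rng.normal(50, 6, size=25)   # post-treatment: reduced spread
print(jackknife_log_var_ratio(pre, post))
```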
Free-recall verbal learning is analyzed in terms of a probability model. The general theory assumes that the probability of recalling a word on any trial is completely determined by the number of times the word has been recalled on previous trials. Three particular cases of this general theory are examined. In these three cases, specific restrictions are placed upon the relation between probability of recall and number of previous recalls. The application of these special cases to typical experimental data is illustrated. An interpretation of the model in terms of set theory is suggested but is not essential to the argument.
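To make the general assumption concrete, here is one hypothetical special case (not necessarily one of the three examined in the paper) in which recall probability depends only on k, the number of previous recalls, together with a short trial-by-trial simulation:

```python
import numpy as np

def recall_probability(k, p0=0.2, beta=0.7):
    """Hypothetical special case: recall probability depends only on k,
    the count of previous recalls, rising toward 1 as k grows."""
    return 1.0 - (1.0 - p0) * beta ** k

def simulate(n_words=40, n_trials=10, seed=0):
    """Simulate free recall; each word's state is just its recall count."""
    rng = np.random.default_rng(seed)
    counts = np.zeros(n_words, dtype=int)
    recalled_per_trial = []
    for _ in range(n_trials):
        p = recall_probability(counts)
        hits = rng.random(n_words) < p
        counts += hits
        recalled_per_trial.append(int(hits.sum()))
    return recalled_per_trial

print(simulate())  # mean recall should rise across trials
```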
Foliar-applied postemergence herbicides are a critical component of corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] weed management programs in North America. Rainfall and air temperature around the time of application may affect the efficacy of herbicides applied postemergence in corn or soybean production fields. However, previous research utilized a limited number of site-years and may not capture the range of rainfall and air temperatures that these herbicides are exposed to throughout North America. The objective of this research was to model the probability of achieving successful weed control (≥85%) with commonly applied postemergence herbicides across a broad range of environments. A large database of more than 10,000 individual herbicide evaluation field trials conducted throughout North America was used in this study. The database was filtered to include only trials with a single postemergence application of fomesafen, glyphosate, mesotrione, or fomesafen + glyphosate. Waterhemp [Amaranthus tuberculatus (Moq.) Sauer], morningglory species (Ipomoea spp.), and giant foxtail (Setaria faberi Herrm.) were the weeds of focus. Separate random forest models were created for each weed species by herbicide combination. The probability of successful weed control deteriorated when the average air temperature within the first 10 d after application was <19 or >25 C for most of the herbicide by weed species models. Additionally, drier conditions before postemergence herbicide application reduced the probability of successful control for several of the herbicide by weed species models. As air temperatures increase and rainfall becomes more variable, weed control with many of the commonly used postemergence herbicides is likely to become less reliable.
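Both this abstract and the glufosinate study above describe the same modelling strategy: train a random forest per weed-by-herbicide combination to predict the probability that control reaches the ≥85% threshold from weather covariates. The sketch below illustrates that idea on synthetic data; every feature name, coefficient, and data value is invented, and the underlying trial database is not public.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 2000
# Hypothetical weather features for one weed-by-herbicide combination.
X = pd.DataFrame({
    "rain_5d_before_mm": rng.gamma(2.0, 10.0, n),
    "temp_10d_after_C": rng.normal(22, 4, n),
    "rel_humidity_pct": rng.uniform(40, 95, n),
    "solar_MJ_m2_d": rng.uniform(10, 30, n),
})
# Invented response: control >= 85% is more likely in warm, humid, wet conditions.
logit = (0.04 * X["rain_5d_before_mm"] - 0.08 * abs(X["temp_10d_after_C"] - 24)
         + 0.03 * X["rel_humidity_pct"] + 0.05 * X["solar_MJ_m2_d"] - 3.5)
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
# Predicted probability of successful (>=85%) control under new conditions.
print(rf.predict_proba(X_te.head())[:, 1])
```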
How was trust created and reinforced between the inhabitants of medieval and early modern cities? And how did the social foundations of trusting relationships change over time? Current research highlights the role of kinship, neighbourhood, and associations, particularly guilds, in creating ‘relationships of trust’ and social capital in the face of high levels of migration, mortality, and economic volatility, but it tells us little about their relative importance or how they developed. We examine almost 15,000 networks of sureties created to secure orphans’ inheritances in one of Europe’s major cities, London, over three centuries, from the 1330s to the 1680s, measuring the presence of trusting relationships connected by guild membership, family, and place. We uncover a profound shift in the contribution of family and guilds to trust networks among the middling and elite: a marked increase in the role of kinship – a re-embedding of trust within the family – and a decline in the importance of shared guild membership in connecting the Londoners who secured orphans’ inheritances together. These developments indicate a far-reaching transformation in the social fabric of urban society.
Hemodynamic collapse in multi-trauma patients with severe traumatic brain injury (TBI) poses both a diagnostic and a therapeutic challenge for prehospital clinicians. Brain injury-associated shock (BIAS), likely resulting from catecholamine storm, can cause both ventricular dysfunction and vasoplegia but may present clinically in a manner similar to hemorrhagic shock. Despite the different treatment strategies required, few studies describe this phenomenon in the early post-injury phase. This retrospective observational study aimed to describe the frequency of shock in isolated TBI among prehospital trauma patients and to compare the clinical characteristics of these patients with those of patients with hemorrhagic shock and patients with TBI without shock.
Methods:
All prehospital trauma patients intubated by prehospital medical teams from New South Wales Ambulance Aeromedical Operations (NSWA-AO) with an initial Glasgow Coma Scale (GCS) score of 12 or less were investigated. Shock was defined as a pre-intubation systolic blood pressure under 90 mmHg together with the administration of blood products or vasopressors. Injuries were classified from in-hospital computed tomography (CT) reports. From this, three study groups were derived: BIAS, hemorrhagic shock, and isolated TBI without shock. Descriptive statistics were then produced for clinical and treatment variables.
Results:
Of 1,292 intubated patients, 423 had an initial GCS score of 12 or less; 24 patients (5.7% of the original cohort) had shock with an isolated TBI, and 39 patients had hemorrhagic shock. Hemodynamic parameters were similar amongst these groups, with comparable degrees of tachycardia, hypotension, and elevated shock index. Prehospital clinical interventions, including blood transfusion and total fluids administered, were also similar, suggesting that the two shock groups were indistinguishable to prehospital clinicians.
Conclusions:
Hemodynamic compromise in the setting of isolated severe TBI is a rare clinical entity. The prehospital physiological data currently available to clinicians do not allow these patients to be easily distinguished from those with hemorrhagic shock.
The subject of this chapter is Grágás, the compilation of the laws of Iceland in the Commonwealth period. The chapter begins by outlining the court structure of Iceland and the fundamentals of legal procedure, briefly discussing the importance of law to the conversion narrative in Íslendingabók and its account of the first decision to put Iceland’s laws into writing. It describes the distinctive concepts and customs which underlie the legal system of medieval Iceland, looking at the role of the búi (neighbour) in legal procedure, and explaining the key concepts of helgi (the right of inviolability), grið (domicile, or household attachment), vígt (the right to kill or to avenge a wrong with impunity), and the problem of dealing with ómagar (dependants). The chapter argues that the laws and sagas are often mutually informing and demonstrates how fundamental an understanding of law is to the interpretation of the Íslendingasögur. It gives numerous examples of how the laws can be used to help elucidate the sagas, and uses the sagas to reveal the importance of law and legal knowledge in medieval Icelandic society.
Children with CHD or born very preterm are at risk for brain dysmaturation and poor neurodevelopmental outcomes. Yet, studies have primarily investigated neurodevelopmental outcomes of these groups separately.
Objective:
To compare neurodevelopmental outcomes and parent behaviour ratings of children born at term with CHD to those of children born very preterm.
Methods:
A clinical research sample of 181 children (CHD [n = 81]; very preterm [≤32 weeks; n = 100]) was assessed at 18 months.
Results:
Children with CHD and children born very preterm did not differ on Bayley-III cognitive, language, or motor composite scores, nor on expressive language, receptive language, or fine motor scaled scores. Children with CHD had lower gross motor scaled scores than children born very preterm (p = 0.047). More children with CHD had impaired scores (standard scores <70) on the language composite (17%), expressive language (16%), and gross motor (14%) indices than children born very preterm (6%, 7%, and 3%, respectively; ps < 0.05). No group differences were found on behaviours rated by parents on the Child Behaviour Checklist (1.5–5 years) or in the proportion of children with scores above the clinical cutoff. English as a first language was associated with higher cognitive (p = 0.004) and language composite scores (p < 0.001). Lower median household income and English as a second language were associated with higher total behaviour problems (ps < 0.05).
Conclusions:
At 18 months, children with CHD were more likely to display language and motor impairment than children born very preterm. Outcomes were associated with the language spoken in the home and with household income.
Traumatic brain injury (TBI) and concussion are associated with increased dementia risk. Accurate estimates of lifetime TBI/concussion exposure are largely lacking for less common neurodegenerative conditions such as frontotemporal dementia (FTD). We evaluated lifetime TBI and concussion frequency in patients diagnosed with a range of FTD spectrum conditions and related prior head trauma to cavum septum pellucidum (CSP) characteristics observable on MRI.
Participants and Methods:
We administered the Ohio State University TBI Identification and Boston University Head Impact Exposure Assessment to 108 patients (age 69.5 ± 8.0, 35% female, 93% white or unknown race) diagnosed at the UCSF Memory and Aging Center with one of the following FTD or related conditions: behavioral variant frontotemporal dementia (N=39), semantic variant primary progressive aphasia (N=16), nonfluent variant PPA (N=23), corticobasal syndrome (N=14), or progressive supranuclear palsy (N=16). Data were also obtained from 217 controls (“HC”; age 76.8 ± 8.0, 53% female, 91% white or unknown race). CSP characteristics were defined based on width or “grade” (0-1 vs. 2+) and length of anterior-posterior separation (millimeters). We first describe the frequency of any and of multiple (2+) prior TBI based on different but commonly used definitions: TBI with loss of consciousness (LOC), TBI with LOC or posttraumatic amnesia (LOC/PTA), and TBI with LOC/PTA or other symptoms such as dizziness, nausea, or “seeing stars” (“concussion”). TBI/concussion frequency was then compared between FTD and HC groups using chi-square tests. Associations between TBI/concussion and CSP characteristics were analyzed with chi-square tests (CSP grade) and Mann-Whitney U tests (CSP length). We explored sex differences given the typically higher rates of TBI among males.
Results:
History of any TBI with LOC (FTD = 20.0%, HC = 19.2%), TBI with LOC/PTA (FTD = 32.2%, HC = 31.5%), and concussion (FTD = 50.0%, HC = 44.3%) was common but did not differ between study groups (ps > .4). In both the FTD and HC groups, prior TBI/concussion was nominally more frequent in males, but not significantly so. Frequency of repeat TBI/concussion (2+) also did not differ significantly between FTD and HC (repeat TBI with LOC: 6.7% vs. 3.3%; TBI with LOC/PTA: 12.2% vs. 10.3%; concussion: 30.2% vs. 28.7%; ps > .2). Prior TBI/concussion was not significantly related to CSP grade or length in the total sample or within the FTD or HC groups.
Conclusions:
TBI/concussion rates depend heavily on the symptom definition used to classify prior injury. Lifetime symptomatic TBI/concussion is common but has an unclear impact on risk for FTD-related diagnoses. Larger samples are needed to appropriately evaluate sex differences, to test whether TBI/concussion rates differ between specific FTD phenotypes, and to understand the rates and effects of more extensive repetitive head trauma (symptomatic and asymptomatic) in patients with FTD.
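For readers unfamiliar with the tests named in the Methods, the sketch below reproduces the two kinds of comparison (group frequencies via chi-square, CSP length via Mann-Whitney U) on invented numbers; none of the counts or values are the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency, mannwhitneyu

# TBI-with-LOC history (exposed vs. unexposed) by group -- hypothetical counts.
table = np.array([[22, 86],    # FTD: exposed, unexposed
                  [42, 175]])  # HC:  exposed, unexposed
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")

# CSP length (mm) by prior-concussion status -- hypothetical values.
rng = np.random.default_rng(3)
csp_concussed = rng.exponential(2.0, size=54)
csp_unconcussed = rng.exponential(1.8, size=54)
u, p = mannwhitneyu(csp_concussed, csp_unconcussed)
print(f"Mann-Whitney U = {u:.0f}, p = {p:.3f}")
```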
Sleep problems, which are associated with poor mental health and academic outcomes, may have been exacerbated by the COVID-19 pandemic.
Aims
To describe sleep in undergraduate students during the COVID-19 pandemic.
Method
This longitudinal analysis included data from 9523 students over 4 academic years (2018–2022) spanning different phases of the pandemic. Students completed a biannual survey assessing risk factors, mental health symptoms and lifestyle, using validated measures. Sleep was assessed with the Sleep Condition Indicator (SCI-8). Propensity weights and multivariable log-binomial regressions were used to compare sleep across four successive first-year cohorts. Linear mixed-effects models were used to examine changes in sleep over academic semesters and years.
Results
There was an overall decrease in average SCI-8 scores, indicating worsening sleep across academic years (average change −0.42 per year; P-trend < 0.001), and an increase in probable insomnia at university entry (range 18.1–29.7%; P-trend < 0.001) across cohorts entering before and up to the peak of the pandemic. Sleep improved somewhat in autumn 2021, when restrictions loosened. Students commonly reported daytime consequences of poor sleep, including problems with mood, energy and relationships (36–48%) and with concentration, productivity and daytime sleepiness (54–66%). There was a consistent pattern of worsening sleep over the academic year. Probable insomnia was associated with increased cannabis use and passive screen time, and with reduced recreation and exercise.
Conclusions
Sleep difficulties are common and persistent among students; they were amplified by the pandemic and worsen over the academic year. Given the importance of sleep for well-being and academic success, a preventive focus on sleep hygiene, healthy lifestyle and low-intensity sleep interventions seems justified.
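As a sketch of the longitudinal model described in the Methods, the code below fits a random-intercept linear mixed-effects model of SCI-8 scores on academic years elapsed; the variable names, wave structure, and effect size are assumptions made for illustration, not the study's analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n_students, n_waves = 400, 8               # biannual surveys, 4 academic years
student = np.repeat(np.arange(n_students), n_waves)
years_elapsed = np.tile(np.arange(n_waves) / 2.0, n_students)
baseline = rng.normal(22, 4, n_students)[student]  # per-student SCI-8 intercept
sci8 = baseline - 0.42 * years_elapsed + rng.normal(0, 3, n_students * n_waves)
df = pd.DataFrame({"student": student, "year": years_elapsed, "sci8": sci8})

# Random-intercept model: average SCI-8 change per academic year.
fit = smf.mixedlm("sci8 ~ year", df, groups=df["student"]).fit()
print(fit.params["year"])  # slope should be near the simulated -0.42 per year
```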
We present and evaluate the prospects for detecting coherent radio counterparts to gravitational wave (GW) events using Murchison Widefield Array (MWA) triggered observations. The MWA rapid-response system, combined with its buffering mode ($\sim$4 min of negative latency), enables us to catch any radio signals produced from seconds before to hours after a binary neutron star (BNS) merger. The large field of view of the MWA ($\sim 1\,000\,\textrm{deg}^2$ at 120 MHz) and its location under the high-sensitivity sky region of the LIGO-Virgo-KAGRA (LVK) detector network give a high chance of being on-target for a GW event. We consider three observing configurations for the MWA to follow up GW BNS merger events: a single dipole per tile, the full array, and four sub-arrays. We then perform a population synthesis of BNS systems to predict the radio-detectable fraction of GW events under these configurations. We find that the four sub-array configuration is the best compromise between sky coverage and sensitivity, as it is capable of placing meaningful constraints on the radio emission from 12.6% of GW BNS detections. Based on the timescales of four BNS merger coherent radio emission models, we propose an observing strategy that triggers the buffering mode to target coherent signals emitted prior to, during, or shortly following the merger, followed by continued recording for up to three hours to target later post-merger emission. We expect the MWA to trigger on $\sim$5–22 BNS merger events during the LVK O4 observing run, which could potentially result in two detections of the predicted coherent emission.
Children with congenital heart disease (CHD) can face neurodevelopmental, psychological, and behavioural difficulties beginning in infancy and continuing through adulthood. Despite overall improvements in medical care and a growing focus on neurodevelopmental screening and evaluation in recent years, neurodevelopmental disabilities, delays, and deficits remain a concern. The Cardiac Neurodevelopmental Outcome Collaborative was founded in 2016 with the goal of improving neurodevelopmental outcomes for individuals with CHD and paediatric heart disease. This paper describes the establishment of a centralised clinical data registry to standardise data collection across member institutions of the Cardiac Neurodevelopmental Outcome Collaborative. The goal of this registry is to foster collaboration on large, multi-centre research and quality improvement initiatives that will benefit individuals and families with CHD and improve their quality of life. We describe the components of the registry, the initial research projects proposed using its data, and lessons learned in its development.
This review traces the development of motivational interviewing (MI) from its happenstance beginnings and the first description published in this journal in 1983, to its continuing evolution as a method that is now in widespread practice in many professions, nations and languages. The efficacy of MI has been documented in hundreds of controlled clinical trials, and extensive process research sheds light on why and how it works. Developing proficiency in MI is facilitated by feedback and coaching based on observed practice after initial training. The author reflects on parallels between MI core processes and the characteristics found in 70 years of psychotherapy research to distinguish more effective therapists. This suggests that MI offers an evidence-based therapeutic style for delivering other treatments more effectively. The most common use of MI now is indeed in combination with other treatment methods such as cognitive behaviour therapies.
Recent research has shown that risk and reward are positively correlated in many environments, and that people have internalized this association as a “risk-reward heuristic”: when making choices based on incomplete information, people infer probabilities from payoffs and vice versa, and these inferences shape their decisions. We extend this work by examining people’s expectations about another fundamental trade-off: that between monetary reward and delay. In 2 experiments (total N = 670), we adapted a paradigm previously used to demonstrate the risk-reward heuristic. We presented participants with intertemporal choice tasks in which either the delayed reward or the length of the delay was obscured. Participants inferred larger rewards for longer stated delays, and longer delays for larger stated rewards; these inferences also predicted their willingness to take the delayed option. In exploratory analyses, we found that older participants inferred longer delays and smaller rewards than did younger ones. All of these results replicated in 2 large-scale pre-registered studies with participants from a different population (total N = 2138). Our results suggest that people expect intertemporal choice tasks to offer a trade-off between delay and reward, and that they differ in their expectations about this trade-off. This “delay-reward heuristic” offers a new perspective on existing models of intertemporal choice and provides new insights into unexplained yet systematic individual differences in the willingness to delay gratification.
Background: In March–April 2021, 23 patients at a 906-bed hospital in Delaware had surgical implantation of a bone graft product contaminated with Mycobacterium tuberculosis; 17 patients were rehospitalized for surgical site infections and 6 developed pulmonary tuberculosis. In May 2021, we investigated this tuberculosis outbreak and conducted a large, multidisciplinary contact investigation among healthcare personnel (HCP) and patients potentially exposed over an extended period in multiple departments. Methods: Exposed HCP were those identified by their managers as present, without the use of airborne precautions, in operating rooms (ORs) during index spine surgeries or subsequent procedures, in the postanesthesia care unit (PACU) when patients had draining wounds, in inpatient rooms when wound care was performed, or in the sterile processing department (SPD) on days when repeated surgeries were performed. We created and assigned an online education module and symptom-screening questionnaire to exposed HCP. Employee health services (EHS) instituted a dedicated phlebotomy station to provide interferon-γ release assay (IGRA) testing for HCP at ≥8 weeks after last known exposure. EHS managed all exposed HCP, including nonemployees (eg, private surgeons), via automated e-mail reminders, which were escalated through supervisory chains as needed until follow-up was complete. The infection prevention team notified exposed patients, defined as those who shared semiprivate rooms with case patients with transmissible tuberculosis, and the Delaware Division of Public Health performed their IGRA testing. Results: There were 506 exposed HCP in ORs (n = 100), the PACU (n = 87), inpatient units (n = 140), the SPD (n = 54), and other locations (n = 122); 83% were employed by the health system. Surgical masks and eye protection were routinely used during patient care. All exposed HCP completed screening by December 17, 2021. Two HCP had positive IGRAs without symptoms or chest radiograph abnormalities, indicating latent tuberculosis infection; however, record review and interviews revealed that both had previously tested positive and had been treated for latent tuberculosis infection. Five exposed patients tested negative, and results for 2 remain pending. Conclusions: This large investigation demonstrated the need for a systematic process that encompassed all exposed HCP, including nonemployees, and incorporated administrative controls to ensure complete follow-up. We did not identify any conversions related to this outbreak despite the high burden of disease in case patients and multiple exposures to contaminated bone-graft material and infectious bodily fluids without respirator use. Transmission risk was likely reduced by baseline surgical mask use and the rapid institution of airborne precautions after outbreak recognition.
Praziquantel (PZQ) remains the drug of choice for the treatment of schistosomiasis, which is caused by parasitic flatworms. The widespread use of PZQ in schistosomiasis-endemic areas for about four decades raises concerns about the emergence of resistance of Schistosoma spp. to PZQ under drug selection pressure, reinforcing the urgency of finding alternative therapeutic options that could replace or complement PZQ. We explored the antischistosomal potential of medicinal plants commonly used by indigenous communities in Kenya for the treatment of various ailments, including malaria, pneumonia, and diarrhoea. Employing the Soxhlet extraction method with different solvents, seven medicinal plants (Artemisia annua, Ajuga remota, Bredilia micranta, Cordia africana, Physalis peruviana, Prunus africana and Senna didymobotrya) were extracted. Qualitative phytochemical screening was performed to determine the presence of various phytochemicals in the plant extracts. Extracts were tested against Schistosoma mansoni newly transformed schistosomula (NTS) and adult worms, and schistosomicidal activity was determined using the adenosine triphosphate quantitation assay. Phytochemical analysis showed different classes of compounds, such as alkaloids, tannins, and terpenes, in the extracts active against S. mansoni worms. Seven of the 22 extracts resulted in <20% NTS viability within 24 h at 100 μg/ml. Five of the extracts with inhibitory activity against NTS showed >69.7% and ≥72.4% reductions in adult worm viability after exposure for 24 and 48 h, respectively. This study provides encouraging preliminary evidence that extracts of Kenyan medicinal plants deserve further study as potential alternative therapeutics that may form the basis for the development of new treatments for schistosomiasis.
Many short gamma-ray bursts (GRBs) originate from binary neutron star mergers, and several theories predict the production of coherent, prompt radio signals either prior to, during, or shortly following the merger, as well as persistent pulsar-like emission from the spin-down of a magnetar remnant. Here we present a low-frequency (170–200 MHz) search for coherent radio emission associated with nine short GRBs detected by the Swift and/or Fermi satellites using the Murchison Widefield Array (MWA) rapid-response observing mode. The MWA began observing these events within 30–60 s of their high-energy detection, enabling us to capture any dispersion-delayed signals emitted by short GRBs over a typical range of redshifts. We conducted transient searches at the GRB positions on timescales of 5 s, 30 s, and 2 min, resulting in the most constraining flux density limits on any associated transient of 0.42, 0.29, and 0.084 Jy, respectively. We also searched for dispersed signals at a temporal and spectral resolution of 0.5 s and 1.28 MHz, but none were detected. However, the fluence limit of 80–100 Jy ms derived for GRB 190627A is the most stringent to date for a short GRB. Assuming the formation of a stable magnetar for this GRB, we compared the fluence and persistent emission limits to short GRB coherent emission models, placing constraints on key parameters including the radio emission efficiency of the nearly merged neutron stars ($\epsilon_r \lesssim 10^{-4}$), the fraction of magnetic energy in the GRB jet ($\epsilon_B \lesssim 2\times10^{-4}$), and the radio emission efficiency of the magnetar remnant ($\epsilon_r \lesssim 10^{-3}$). Comparing the limits derived for our full GRB sample (along with those in the literature) to the same emission models, we demonstrate that our fluence limits place only weak constraints on the prompt emission predicted from the interaction between the relativistic GRB jet and the interstellar medium for a subset of magnetar parameters. However, the 30-min flux density limits were sensitive enough to theoretically detect the persistent radio emission from magnetar remnants up to a redshift of $z\sim0.6$. Our non-detection of this emission could imply that some GRBs in the sample were not genuinely short or did not result from a binary neutron star merger, that the GRBs were at high redshifts, that these mergers formed atypical magnetars, that the radiation beams of the magnetar remnants pointed away from Earth, or that the majority did not form magnetars but instead collapsed directly into black holes.