Targeting the glutamatergic system is posited as a potentially novel therapeutic strategy for psychotic disorders. While studies in patients indicate that antipsychotic medication reduces brain glutamatergic measures, they were unable to disambiguate clinical changes from drug effects.
Aims
To address this, we investigated the effects of a dopamine D2 receptor partial agonist (aripiprazole) and a dopamine D2 receptor antagonist (amisulpride) on glutamatergic metabolites in the anterior cingulate cortex (ACC), striatum and thalamus in healthy controls.
Method
A double-blind, within-subject, cross-over, placebo-controlled study with two arms (n = 25 per arm) was conducted. Healthy volunteers received either aripiprazole (up to 10 mg/day) or amisulpride (up to 400 mg/day) for 7 days, and a corresponding period of placebo treatment, in a pseudo-randomised order. Proton magnetic resonance spectroscopy (1H-MRS) was used to measure glutamatergic metabolite levels at three time points: baseline, after 1 week of drug treatment and after 1 week of placebo. Values were analysed as a combined measure across the ACC, striatum and thalamus.
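The abstract does not state the exact statistical model, but a within-subject drug-versus-placebo contrast of this kind is commonly fitted as a linear mixed model with a random intercept per participant. The sketch below illustrates that approach in Python; the column names (subject, condition, region, glx) are placeholders, not the study's actual variables.

```python
# Illustrative sketch only: a within-subject comparison of Glx between drug and
# placebo conditions, fit as a linear mixed model with a random intercept per
# participant. Column names are assumed, not taken from the study.
import pandas as pd
import statsmodels.formula.api as smf

def drug_vs_placebo_effect(df: pd.DataFrame):
    """df: one row per subject x condition x region, with columns
    'subject', 'condition' ('placebo' or 'drug'), 'region', and 'glx'."""
    model = smf.mixedlm("glx ~ condition + region", data=df, groups=df["subject"])
    result = model.fit()
    # Fixed-effect estimates (beta for the drug condition) and 95% CIs
    return result.params, result.conf_int()
```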
Results
Aripiprazole significantly increased glutamate + glutamine (Glx) levels compared with placebo (β = 0.55, 95% CI [0.15, 0.95], P = 0.007). At baseline, the mean Glx level was 8.14 institutional units (s.d. = 2.15); following aripiprazole treatment, the mean Glx level was 8.16 institutional units (s.d. = 2.40), compared with 7.61 institutional units (s.d. = 2.36) for placebo. This effect remained significant after adjusting for plasma levels of the parent drug and its active metabolite. Glx also increased with amisulpride, but this increase did not reach statistical significance.
Conclusions
One week of aripiprazole administration in healthy participants altered brain Glx levels as compared with placebo administration. These findings provide novel insights into the relationship between antipsychotic treatment and brain metabolites in a healthy participant cohort.
We conducted an analysis of a nationwide survey of US physician offices between 2016 and 2019 and calculated annualized prevalence rates of urinary tract infections (UTIs). During the 3-year study period, UTI was the most common infection in US physician offices, accounting for approximately 10 million annualized encounters.
To improve its management capacity, Frontiers Clinical and Translational Science Institute overhauled its evaluation infrastructure to be comprehensive, efficient, and transparent in demonstrating outputs and outcomes. We built a platform that standardized measures across program areas, integrated continuous improvement processes, and reduced the data entry burden for investigators. Using the Utilization-Focused Evaluation Framework, we created logic models to identify appropriate metrics. We built the evaluation data platform within REDCap to capture requests, events, attendance, and outcomes and to push work processes to Navigators. We initiated a membership model to serve as the backbone of the platform, which allowed tailored communication and demographic data capture and further reduced data entry burden. The platform consists of nine REDCap projects across multiple programmatic areas. Using REDCap Dynamic SQL query fields and External Modules, the membership module was integrated into all forms to check and collect membership before service access. Data are synced to a dashboard for tracking outputs and outcomes in real time. Since the launch of the evaluation platform in Fall 2022, Frontiers has increased its workflow efficiency and streamlined continuous improvement communication. The platform can serve as a model for other hubs seeking to build efficient processes and create comprehensive, transparent evaluation plans.
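As a rough illustration of how such a dashboard feed might work, the sketch below pulls records from a single REDCap project through REDCap's standard record-export API so they can be aggregated downstream. The URL, token handling, and field names are placeholders and do not reflect Frontiers' actual configuration or its External Module logic.

```python
# Illustrative sketch: exporting records from one REDCap project via the
# standard REDCap API so they can be aggregated for a dashboard. The host URL
# and field names are hypothetical placeholders.
import requests

REDCAP_API_URL = "https://redcap.example.edu/api/"  # hypothetical host

def export_records(api_token: str, fields=None):
    payload = {
        "token": api_token,
        "content": "record",   # standard REDCap record export
        "format": "json",
        "type": "flat",
    }
    if fields:
        # REDCap expects repeated fields[n] keys when requesting a field subset
        payload.update({f"fields[{i}]": name for i, name in enumerate(fields)})
    response = requests.post(REDCAP_API_URL, data=payload, timeout=30)
    response.raise_for_status()
    return response.json()

# e.g. counts of service requests per program area for a real-time dashboard:
# records = export_records(token, fields=["record_id", "program_area", "request_date"])
```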
The growing popularity of home-sharing platforms such as Airbnb, partly fueled by hosts’ ability to evade local taxes and regulations, has been shown to elevate housing costs by reallocating long-term housing units to the short-term rental market. This study assesses whether enhanced tax enforcement can mitigate this trend. We analyze staggered tax collection agreements between Airbnb and Florida counties, wherein Airbnb collects taxes from the hosts directly. Using a difference-in-differences methodology, we find these agreements significantly slow the growth of housing costs, highlighting the importance of tax policy in addressing the sharing economy’s influence on housing affordability.
Recruitment of representative and generalizable adult samples is a major challenge for researchers conducting economic field experiments. Limited access to representative samples or the high cost of obtaining them often leads to the recruitment of non-representative convenience samples. This research compares the findings from two field experiments involving 860 adults: one from a non-representative in-person convenience sample and one from a representative online counterpart. We find no meaningful differences in the key behaviors of interest between the two samples. These findings contribute to a growing body of literature demonstrating that non-representative convenience samples can be sufficient in certain contexts.
Ward round quality is a pivotal component of surgical care and is intimately associated with patient outcomes. Despite this, ward rounds remain largely understudied and underrepresented in medical literature. Accurate and thorough ward round documentation is known to improve communication and patient outcomes and to reduce hospital expenditure. This study aimed to determine the accuracy of ward round documentation.
Methods
A prospective observational cohort study was performed as a sub-analysis of a larger study by reviewing 135 audiovisual recordings of surgical ward rounds over two years at two hospitals. The recordings were transcribed verbatim, and the content was assigned a level of importance by an external reviewer. The transcripts were then compared with the written case notes to determine the accuracy of documentation and the importance of omitted information. Patient age, sex, and length of stay, as well as the senior doctor leading and the intern documenting the ward round, were assessed using multivariable linear mixed-effects models to determine their impact on documentation accuracy.
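The abstract does not give the exact model specification, but a documentation-accuracy analysis of this kind could be set up as a linear mixed-effects model with a random intercept for the documenting intern, as in the illustrative Python sketch below; all variable names are assumed rather than taken from the study.

```python
# Illustrative sketch: a linear mixed-effects model of documentation accuracy
# with a random intercept for the documenting intern. Column names
# (accuracy_pct, age, sex, los_days, intern_id) are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

def fit_accuracy_model(notes: pd.DataFrame):
    """notes: one row per ward-round encounter."""
    model = smf.mixedlm(
        "accuracy_pct ~ age + sex + los_days",
        data=notes,
        groups=notes["intern_id"],  # clusters encounters within interns
    )
    return model.fit()
```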
Results
Nearly one-third (32.4%) of spoken information on the surgical ward round that was deemed “important”, including discharge plans and bookings for surgery, was absent from the patients’ electronic medical records. Additionally, in 11% of case notes there was a major conflict between the ward round discussion and what was documented. Younger patients (p=0.04) and patients who had been on the ward longer (p=0.005) were less likely to have accurate documentation. Some interns were significantly worse at documenting discussions than others (p<0.0001). Day of the week, location, and the senior doctor present did not affect documentation accuracy.
Conclusions
This study demonstrates that a significant amount of important discussion during surgical ward rounds regarding patient care is not recorded accurately, or at all, in the patient medical record. This can lead to preventable patient complications and longer hospital stays, resulting in increased strain on hospital resources. This study emphasizes the need for further research to address this problem.
Special education enrollment increased in Flint following the 2014–2015 Flint Water Crisis, but lead exposure is not plausibly responsible. Labeling Flint children as lead poisoned and/or brain damaged may have contributed to rising special education needs (ie, a nocebo effect). To better document this possibility, we surveyed schoolteachers and reviewed neuropsychological assessments of children for indications of negative labeling.
Methods
A survey of Flint and Detroit (control) public schoolteachers using a modified Illness Perception Questionnaire was conducted 5 years post-crisis. We also examined neuropsychological assessments from a recently settled class-action lawsuit.
Results
Relative to Detroit (n = 24), Flint teachers (n = 11) believed that a higher proportion of their students had harmful lead exposure (91.8% Flint vs 46% Detroit; P = 0.00034), were lead poisoned (51.3% vs 24.3%; P = 0.018), or brain damaged (28.8% vs 12.9%; P = 0.1), even though blood lead levels in Flint children were always less than half of those in Detroit children. Neuropsychological assessments diagnosed lead poisoning and/or brain damage from water lead exposure in all tested children (n = 8), even though none had evidence of elevated blood lead and a majority had prior learning disability diagnoses.
Conclusion
Teachers’ responses and neuropsychological assessments suggest Flint children were harmed by a nocebo effect.
Functional near-infrared spectroscopy (fNIRS) is a non-invasive functional neuroimaging method that takes advantage of the optical properties of hemoglobin to provide an indirect measure of brain activation via task-related relative changes in oxygenated hemoglobin (HbO). Its advantage over fMRI is that fNIRS is portable and can be used while walking and talking. In this study, we used fNIRS to measure brain activity in prefrontal and motor regions of interest (ROIs) during single- and dual-task walking, with the goal of identifying neural correlates of dual-task walking performance.
Participants and Methods:
Nineteen healthy young adults [mean age=25.4 (SD=4.6) years; 14 female] engaged in five tasks: standing single-task cognition (serial-3 subtraction); single-task walking at a self-selected comfortable speed on a 24.5m oval-shaped course (overground walking) and on a treadmill; and dual-task cognition+walking on the same overground course and treadmill (8 trials/condition: 20 seconds standing rest, 30 seconds task). Performance on the cognitive task was quantified as the number of correct subtractions, number of incorrect subtractions, number of self-corrected errors, and percent accuracy over the 8 trials. Walking speed (m/sec) was recorded for all walking conditions. fNIRS data were collected on a system consisting of 16 sources, 15 detectors, and 8 short-separation detectors covering the following ROIs: right and left lateral frontal (RLF, LLF), right and left medial frontal (RMF, LMF), right and left medial superior frontal (RMSF, LMSF), and right and left motor (RM, LM). Lateral and medial refer to the ROIs’ relative positions on the lateral prefrontal cortex. fNIRS data were analyzed in Homer3 using spline motion correction and the iterative weighted least squares method in the general linear model. Correlations between the cognitive/speed variables and ROI HbO data were computed, with a Bonferroni adjustment for multiple comparisons.
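For illustration, the correlation-with-Bonferroni step described above could be implemented along the following lines; the array names are placeholders, and the fNIRS preprocessing itself was done in Homer3, not in this sketch.

```python
# Illustrative sketch: Pearson correlations between behavioural measures and
# ROI HbO change, with a Bonferroni-adjusted alpha. Dictionary keys/values are
# placeholders for the study's variables.
from scipy.stats import pearsonr

def correlate_with_bonferroni(behaviour: dict, roi_hbo: dict, alpha: float = 0.05):
    """behaviour / roi_hbo: name -> 1-D array of per-participant values."""
    n_tests = len(behaviour) * len(roi_hbo)
    adjusted_alpha = alpha / n_tests  # Bonferroni correction
    results = {}
    for bname, b in behaviour.items():
        for rname, r in roi_hbo.items():
            rho, p = pearsonr(b, r)
            results[(bname, rname)] = (rho, p, p < adjusted_alpha)
    return results
```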
Results:
Subjects with missing cognitive data were excluded from analyses, resulting in sample sizes of 18 for the single-task cognition, dual-task overground walking, and dual-task treadmill walking conditions. During dual-task overground walking, there was a significant positive correlation between walking speed and relative change in HbO in RMSF [r(18)=.51, p<.05] and RM [r(18)=.53, p<.05]. There was a significant negative correlation between total number of correct subtractions and relative change in HbO in LMSF [r(18)=-.75, p<.05] and LM [r(18)=-.52, p<.05] during dual-task overground walking. No other significant correlations were identified.
Conclusions:
These results indicate that there is lateralization of the cognitive and motor components of overground dual-task walking. The right hemisphere appears to be more active the faster people walk during the dual-task. By contrast, the left hemisphere appears to be less active when people are working faster on the cognitive task (i.e., serial-3 subtraction). The latter results suggest that automaticity of the cognitive task (i.e., more total correct subtractions) is related to decreased brain activity in the left hemisphere. Future research will investigate whether there is a change in cognitive automaticity over trials and if there are changes in lateralization patterns in neurodegenerative disorders that are known to differentially affect the hemispheres (e.g., Parkinson’s disease).
The COVID-19 pandemic has disproportionately affected traditionally marginalized groups. Both the Delta and Omicron variants raised concern among public health officials due to potentially higher infectivity rates and disease severity than prior variants. This study sought to compare disease severity between adults infected with the Omicron variant and adults infected with the Delta variant who presented to the Emergency Department at an academic, safety-net hospital in Virginia.
Methods:
This retrospective cohort study used electronic medical record data of patients who presented to the Emergency Department and received a positive SARS-CoV-2 test between September 1, 2021, and January 31, 2022. Positive tests were stratified by genotypic variant through whole genome sequencing. Participants with the Omicron variant were propensity-score matched with individuals with the Delta variant.
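The abstract does not detail the matching procedure, but one common way to build 1:1 propensity-score-matched pairs is logistic-regression scoring followed by greedy nearest-neighbour matching without replacement, as in the illustrative sketch below; the covariate names are hypothetical.

```python
# Illustrative sketch: 1:1 propensity-score matching of Omicron (treated = 1)
# to Delta (treated = 0) cases on baseline covariates. Not the study's exact
# specification; covariate names are placeholders.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def propensity_match(df: pd.DataFrame, treat_col: str, covariates: list):
    X, t = df[covariates].to_numpy(), df[treat_col].to_numpy()
    scores = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]
    treated = np.flatnonzero(t == 1)
    controls = set(np.flatnonzero(t == 0))
    pairs = []
    for i in treated:  # greedy nearest-neighbour matching without replacement
        if not controls:
            break
        j = min(controls, key=lambda c: abs(scores[i] - scores[c]))
        pairs.append((i, j))
        controls.remove(j)
    return pairs, scores
```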
Results:
Among 500 Delta and 500 Omicron participants, 279 propensity score-matched pairs were identified. Participants were predominantly unvaccinated, had medical comorbidities, and self-identified as Black. Individuals infected with the Delta variant had more severe disease compared to those with the Omicron variant, regardless of vaccination status. Patients with kidney, liver, and respiratory disease, as well as cancer, were at higher risk for severe disease. Patients with 2 doses of COVID-19 immunization trended toward less severe disease.
Conclusions:
Overall, these data further support the literature regarding the disproportionate effects of the COVID-19 pandemic on vulnerable patient populations – such as those with limited access to care, people of color, and those with chronic medical conditions – and can be used to inform public health interventions.
Cross-species evidence suggests that the ability to exert control over a stressor is a key dimension of stress exposure that may sensitize frontostriatal-amygdala circuitry to promote more adaptive responses to subsequent stressors. The present study examined neural correlates of stressor controllability in young adults. Participants (N = 56; mean age = 23.74 years, range = 18–30 years) completed either the controllable or uncontrollable stress condition of the first of two novel stressor controllability tasks during functional magnetic resonance imaging (fMRI) acquisition. Participants in the uncontrollable stress condition were yoked to age- and sex-matched participants in the controllable stress condition. All participants were subsequently exposed to uncontrollable stress in the second task, which is the focus of fMRI analyses reported here. A whole-brain searchlight classification analysis revealed that patterns of activity in the right dorsal anterior insula (dAI) during subsequent exposure to uncontrollable stress could be used to classify participants' initial exposure to either controllable or uncontrollable stress with a peak of 73% accuracy. Previous experience of exerting control over a stressor may change the computations performed within the right dAI during subsequent stress exposure, shedding further light on the neural underpinnings of stressor controllability.
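A searchlight analysis scans a small sphere of voxels across the whole brain, but its core computation is a cross-validated classification of condition labels from the voxel pattern within each sphere. The sketch below illustrates that single-sphere step only; the classifier and cross-validation scheme are assumptions, as the abstract does not specify them.

```python
# Illustrative sketch of the cross-validated classification step repeated at
# every searchlight sphere: features are voxel activity patterns within one
# sphere, labels are participants' initial condition (controllable = 1,
# uncontrollable = 0). Classifier choice is an assumption.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

def sphere_accuracy(patterns: np.ndarray, labels: np.ndarray, n_folds: int = 5) -> float:
    """patterns: (n_participants, n_voxels_in_sphere); labels: 0/1 condition."""
    clf = LinearSVC(max_iter=10000)
    scores = cross_val_score(clf, patterns, labels, cv=n_folds)
    return float(scores.mean())  # the study reported a peak accuracy of 73%
```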
The NIH Center for Accelerated Innovations at Cleveland Clinic (NCAI-CC) was funded by the National Heart Lung and Blood Institute (NHLBI) to support academic investigators in technology development and commercialization. NCAI-CC was one of three multi-institutional Centers established in the fall of 2013. The goal of each Center was to catalyze the growth of a commercialization ecosystem within its affiliated institutions and region by managing a program that funded and guided translational project development and by delivering commercialization education programs to participating investigators. NCAI-CC created and managed such a funding program, ultimately supporting 75 different projects across seven academic institutions; it also developed tailored educational content following the National Science Foundation I-Corps™ curriculum and delivered that program to 79 teams from 12 institutions. We determined early on that, in establishing and implementing projects, it is important to support the teams and principal investigators throughout the program. This support includes helping principal investigators shift their mindset from a specific-aims orientation to one focused on project goals and deliverables. Our skills development efforts emphasized commercialization and a deep understanding of customer needs for new technology adoption. Here, we review our experiences, outcomes, and insights, including the challenges identified in program implementation.
Anticholinergic medications block cholinergic transmission. The central effects of anticholinergic drugs can be particularly marked in patients with dementia. Furthermore, anticholinergics antagonise the effects of cholinesterase inhibitors, the main dementia treatment.
Objectives
This study aimed to assess anticholinergic drug prescribing among dementia patients before and after admission to UK acute hospitals.
Methods
352 patients with dementia were included from 17 hospitals in the UK. All were admitted to surgical, medical or Care of the Elderly wards in 2019. Information about patients’ prescriptions was recorded on a standardised form. An evidence-based online calculator was used to calculate the anticholinergic drug burden of each patient. The correlation between anticholinergic burden scores on admission and at discharge was tested with Spearman’s rank correlation.
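For illustration, the Spearman correlation between admission and discharge burden scores could be computed as follows; the argument names are placeholders for the study's paired per-patient scores.

```python
# Illustrative sketch: Spearman rank correlation between each patient's
# anticholinergic burden score on admission and at discharge.
from scipy.stats import spearmanr

def admission_discharge_correlation(admission_scores, discharge_scores):
    """Both arguments: sequences of per-patient burden scores, in the same patient order."""
    rho, p_value = spearmanr(admission_scores, discharge_scores)
    return rho, p_value  # the study reported rho = 0.688, p = 2.2e-16
```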
Results
Table 1 shows patient demographics. On admission, 37.8% of patients had an anticholinergic burden score ≥1 and 5.68% had a score ≥3. At discharge, 43.2% of patients had an anticholinergic burden score ≥1 and 9.1% had a score ≥3. The increase was statistically significant (rho = 0.688; p = 2.2 × 10⁻¹⁶). The most common group of anticholinergic medications prescribed at discharge was psychotropics (see Figure 1). Among patients prescribed cholinesterase inhibitors, 44.9% were also taking anticholinergic medications.
Conclusions
This multicentre cross-sectional study found that people with dementia are frequently prescribed anticholinergic drugs, even if also taking cholinesterase inhibitors, and are significantly more likely to be discharged with a higher anticholinergic drug burden than on admission to hospital.
Conflict of interest
This project was planned and executed by the authors on behalf of SPARC (Student Psychiatry Audit and Research Collaborative). We thank the National Student Association of Medical Research for allowing us use of the Enketo platform. Judith Harrison was su
Recent cannabis exposure has been associated with lower rates of neurocognitive impairment in people with HIV (PWH). Cannabis’s anti-inflammatory properties may underlie this relationship by reducing chronic neuroinflammation in PWH. This study examined relations between cannabis use and inflammatory biomarkers in cerebrospinal fluid (CSF) and plasma, and cognitive correlates of these biomarkers within a community-based sample of PWH.
Methods:
263 individuals were categorized into four groups: HIV− non-cannabis users (n = 65), HIV+ non-cannabis users (n = 105), HIV+ moderate cannabis users (n = 62), and HIV+ daily cannabis users (n = 31). Differences in pro-inflammatory biomarkers (IL-6, MCP-1/CCL2, IP-10/CXCL10, sCD14, sTNFR-II, TNF-α) by study group were determined by Kruskal–Wallis tests. Multivariable linear regressions examined relationships between biomarkers and seven cognitive domains, adjusting for age, sex/gender, race, education, and current CD4 count.
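As an illustration of the two analysis steps described above, the sketch below runs a Kruskal-Wallis test of one biomarker across the four study groups and a covariate-adjusted linear regression of a cognitive domain score on that biomarker; all column names are assumptions, not the study's actual dataset.

```python
# Illustrative sketch: group comparison of a CSF biomarker (Kruskal-Wallis)
# and a covariate-adjusted regression of cognition on that biomarker.
# Column names (study_group, age, sex, race, education, cd4_count) are assumed.
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import kruskal

def biomarker_by_group(df: pd.DataFrame, biomarker: str):
    samples = [g[biomarker].dropna() for _, g in df.groupby("study_group")]
    return kruskal(*samples)  # H statistic and p-value across the four groups

def cognition_on_biomarker(df: pd.DataFrame, domain: str, biomarker: str):
    formula = f"{domain} ~ {biomarker} + age + sex + race + education + cd4_count"
    return smf.ols(formula, data=df).fit()
```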
Results:
HIV+ daily cannabis users showed lower MCP-1 and IP-10 levels in CSF compared to HIV+ non-cannabis users (p = .015; p = .039) and were similar to HIV− non-cannabis users. Plasma biomarkers showed no differences by cannabis use. Among PWH, lower CSF MCP-1 and lower CSF IP-10 were associated with better learning performance (all ps < .05).
Conclusions:
Current daily cannabis use was associated with lower levels of pro-inflammatory chemokines implicated in HIV pathogenesis, and these chemokines were linked to the cognitive domain of learning, which is commonly impaired in PWH. Cannabinoid-related reductions of MCP-1 and IP-10, if confirmed, suggest a role for medicinal cannabis in mitigating the persistent inflammation and cognitive impacts of HIV.
Critical migration studies emerged to trace how restrictive immigration contexts contribute to conditions of migrant “illegality” and deportability. More recently, researchers have turned to examine diversity in migrants’ experiences, revealing how migrant “illegality” and deportability can take varied forms based on different social factors, including migrants’ immigration status, developmental stage, ethno-racial background, gender, and nationality. Yet, despite increasingly nuanced and contextualized accounts of migrants’ lived experiences, the psychology of migrant “illegality” remains under-theorized, as we lack general concepts and frameworks to explain how deportability shapes, and is shaped by, migrants’ psychosocial lives. This article introduces such a framework by drawing upon findings from two ethnographic studies with undocumented migrants in Canada and the United States. Observing common psychosocial patterns in both groups, I propose cycles of deportability as a framework to capture how migrant “illegality” develops at the psychosocial level through repeated occurrences of status-related stressors, which produce both acute and chronic fears that further require distinct agencies and coping strategies. Next, I examine differences in migrants’ cycles of deportability based on their national context and immigrant generation. I conclude by discussing how this framework can help consolidate previous research findings and guide future psychological and critical migration studies.
Hemiparetic walking after stroke is typically slow, asymmetric, and inefficient, significantly impacting activities of daily living. Extensive research shows that functional, intensive, and task-specific gait training is instrumental for effective gait rehabilitation, characteristics that our group aims to encourage with soft robotic exosuits. However, standard clinical assessments may lack the precision and frequency to detect subtle changes in intervention efficacy during both conventional and exosuit-assisted gait training, potentially impeding targeted therapy regimes. In this paper, we use exosuit-integrated inertial sensors to reconstruct three clinically meaningful gait metrics related to circumduction, foot clearance, and stride length. Our method corrects sensor drift using instantaneous information from both sides of the body. This approach makes our method robust to irregular walking conditions poststroke as well as usable in real-time applications, such as real-time movement monitoring, exosuit assistance control, and biofeedback. We validate our algorithm in eight people poststroke in comparison to lab-based optical motion capture. Mean errors were below 0.2 cm (9.9%) for circumduction, −0.6 cm (−3.5%) for foot clearance, and 3.8 cm (3.6%) for stride length. A single-participant case study shows our technique’s promise in daily-living environments by detecting exosuit-induced changes in gait while walking in a busy outdoor plaza.
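As context for the stride-length metric, the sketch below shows a generic way to estimate stride length from a foot-worn inertial sensor: double integration of forward acceleration with a linear drift correction that forces velocity back to zero at each foot contact. This is a simplified, single-sided illustration, not the paper's bilateral drift-correction algorithm.

```python
# Illustrative sketch of a generic stride-length estimate from a foot-worn IMU:
# integrate forward acceleration over one stride and remove linear velocity
# drift using a zero-velocity assumption at consecutive foot contacts.
import numpy as np

def stride_length(accel_fwd: np.ndarray, fs: float, contact_idx: tuple) -> float:
    """accel_fwd: gravity-compensated forward acceleration (m/s^2);
    fs: sample rate (Hz); contact_idx: (start, end) sample indices of
    consecutive foot contacts, where velocity is assumed to be zero."""
    start, end = contact_idx
    dt = 1.0 / fs
    a = accel_fwd[start:end]
    v = np.cumsum(a) * dt                    # integrate acceleration -> velocity
    drift = np.linspace(0.0, v[-1], v.size)  # linear drift so v(end) returns to 0
    v_corrected = v - drift                  # zero-velocity update
    return float(np.sum(v_corrected) * dt)   # integrate velocity -> stride length
```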
Lusala (Dioscorea hirtiflora Benth. subsp. pedicellata Milne-Redh.) is an important wild edible tuber foraged widely from natural forests in Southern Zambia, but at risk from overharvesting and deforestation. Its propagation was investigated in glasshouse studies to explore potential domestication and future in situ and ex situ genetic resources conservation. Almost all tubers planted with visible shoot buds produced vines, with no effect of tuber size on vine emergence or tuber yield. Few tubers without visible shoot buds at planting produced vines, but those that did not produce vines re-tuberized. The progeny tubers provided good vine emergence and similar tuber yield, with vines from tubers produced by re-tuberization being more vigorous. Re-tuberization in the absence of vine emergence also occurred in other experiments. Minisetts cut from the proximal end of tubers provided better vine emergence (with more from 20-mm than 10-mm-long sections) and greater tuber yield than mid- or distal minisetts. Nodal stem cuttings rooted well, vined, and provided small tubers. This study shows that lusala can be propagated successfully from tubers, minisetts, nodal vine cuttings, or mini-tubers from nodal vine cuttings, for genetic resources conservation and/or domestication. Domestication is likely to be hampered by the long period required for vines to emerge and establish. More sustainable foraging, including re-planting in natural forests, is recommended to balance consumption of lusala in the region and promote its long-term conservation.
Introduction: The Maximizing Aging Using Volunteer Engagement in the ED (MAUVE + ED) program connects specially trained volunteers with older patients whose personal and social needs are not always met within the busy ED environment. The objective of this study was to describe the development and implementation of the MAUVE + ED program and the activities its volunteers performed with older patients. Methods: The MAUVE + ED program was implemented in the ED (annual census 65,000) of a large academic tertiary hospital in Toronto, Ontario. Volunteers were trained to identify and approach older patients and others at greater risk for adverse outcomes, including poor patient experience, in the ED, and to invite such patients to participate in the program. The program is available to all patients >65 years, as well as those with confusion, those who are alone, those with mobility issues, and those with an increased length of stay in the ED. Volunteers documented their activities after each patient encounter using a standardized paper-based data collection form. Results: Over the program's initial 6-month period, the MAUVE + ED volunteers reported a total of 896 encounters with 718 unique patients. The median (IQR) time a MAUVE volunteer spent with a patient was 10 (5, 20) minutes, with a range of 1 to 130 minutes. The median (IQR) number of patients seen per shift was 7 (6, 9), with a range of 1 to 16 patients per shift. The most common activities volunteers assisted with were therapeutic activities/social visits (n = 859; 95.9%), orientation activities (n = 501; 55.9%), and hydration assistance (n = 231; 25.8%). The least common were mobility assistance (n = 36; 4.0%) and vision/hearing assistance (n = 13; 1.5%). Conclusion: Preliminary data suggest the MAUVE + ED volunteers were able to enrich the experience of older adults and their families/carers in the ED.
The short allele of the serotonin transporter gene 5’ promoter region polymorphism (5-HTTLPR) is reported by A. Caspi and others to be associated with susceptibility to depression and suicidality in response to stressful life events. We examined the relationship of a triallelic 5-HTTLPR polymorphism to stressful life events (SLE) and severity of major depression and suicidality.
Method
Mood disorder subjects (N=191) and healthy volunteers (N=125), all Caucasians of European origin, were genotyped for the triallelic 5-HTTLPR polymorphism, which comprises two lower-expressing alleles (LG, S) and a higher-expressing LA allele. All subjects underwent structured clinical interviews for DSM-IV diagnoses, ratings of psychopathology, stressful life events, developmental history and suicidal behavior. Cerebrospinal fluid (CSF) 5-HIAA was assayed in a sub-sample.
Results
Compared with the LA allele, the lower-expressing alleles independently predicted greater depression severity and predicted greater severity of major depression in the presence of moderate-to-severe life events. No associations with suicidal behavior or CSF 5-HIAA were found.
Conclusions
Low-expressing transporter alleles explain 31% of the variance in major depression severity and increase the impact of stressful life events on severity. The biological phenotype responsible for these effects remains to be elucidated.
Tourniquets (TQs) save lives. Although military-approved TQs appear more effective than improvised TQs in controlling exsanguinating extremity hemorrhage, their bulk may preclude everyday carry (EDC) by civilian lay-providers, limiting availability during emergencies.
Study Objective:
The purpose of the current study was to compare the efficacy of three novel commercial TQ designs to a military-approved TQ.
Methods:
Nine Emergency Medicine residents evaluated four different TQ designs: Gen 7 Combat Application Tourniquet (CAT7; control), Stretch Wrap and Tuck Tourniquet (SWAT-T), Gen 2 Rapid Application Tourniquet System (RATS), and Tourni-Key (TK). Popliteal artery flow cessation was determined using a ZONARE ZS3 ultrasound. Steady state maximal generated force was measured for 30 seconds with a thin-film force sensor.
Results:
Success rates for distal arterial flow cessation were 89% for CAT7, 67% for SWAT-T, 89% for RATS, and 78% for TK (H = 0.89; P = .83). Mean application times were 10.4 (SD = 1.7) seconds for CAT7, 23.1 (SD = 9.0) seconds for SWAT-T, 11.1 (SD = 3.8) seconds for RATS, and 20.0 (SD = 7.1) seconds for TK (F = 9.71; P < .001). Steady state maximal forces were 29.9 (SD = 1.2) N for CAT7, 23.4 (SD = 0.8) N for SWAT-T, 33.0 (SD = 1.3) N for RATS, and 41.9 (SD = 1.3) N for TK.
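The abstract reports an H statistic for success rates and an F statistic for application times without naming the tests; assuming a Kruskal-Wallis test and a one-way ANOVA respectively, the comparisons could be computed as in the sketch below, with placeholder data structures.

```python
# Illustrative sketch: comparing per-trial application times across the four
# tourniquet designs with a one-way ANOVA (F statistic) and flow-cessation
# success with a Kruskal-Wallis test (H statistic). The choice of tests is an
# assumption, as the abstract does not name them explicitly.
from scipy.stats import f_oneway, kruskal

def compare_designs(times_by_design: dict, success_by_design: dict):
    """Each dict maps a design name (e.g. 'CAT7') to a list of per-trial
    values: seconds for application times, 0/1 for distal flow cessation."""
    f_stat, f_p = f_oneway(*times_by_design.values())
    h_stat, h_p = kruskal(*success_by_design.values())
    return {"application_time": (f_stat, f_p), "flow_cessation": (h_stat, h_p)}
```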
Conclusion:
All novel TQ systems were non-inferior to the military-approved CAT7. Mean application times were less than 30 seconds for all four designs. The size of these novel TQs may make them more conducive to lay-provider EDC, thereby increasing community resiliency and improving the response to high-threat events.