Objective:
To determine the relationship between severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection, hospital-acquired infections (HAIs), and mortality.
Design:
Retrospective cohort.
Setting:
Three St. Louis, MO hospitals.
Patients:
Adults admitted for ≥48 hours from January 1, 2017, to August 31, 2020.
Methods:
Hospital-acquired infections were defined as those occurring ≥48 hours after admission and were based on positive urine, respiratory, and blood cultures. A Poisson interrupted time-series analysis compared the mortality trajectory before the pandemic (beginning January 1, 2017) and during its first 6 months. Multivariable logistic regression models were fitted to identify risk factors for mortality in patients with an HAI before and during the pandemic. A time-to-event analysis considered time to death and discharge by fitting Cox proportional hazards models.
Results:
Among 6,447 admissions with subsequent HAIs, patients were predominantly White (67.9%); the pre-pandemic cohort included more females (50.9% vs 46.1%, P = .02), had a slightly lower body mass index (28 vs 29, P = .001), and had more private insurance (50.6% vs 45.7%, P = .01). In the pre-pandemic era, there were 1,000 (17.6%) patient deaths, whereas there were 160 deaths (21.3%, P = .01) during the pandemic. A total of 53 (42.1%) patients with coronavirus disease 2019 (COVID-19) and an HAI died. Age and comorbidities increased the risk of death in patients with COVID-19 and an HAI. During the pandemic, Black patients with an HAI and COVID-19 were more likely to die than White patients with an HAI and COVID-19.
Conclusions:
In three Midwestern hospitals, patients with concurrent HAIs and COVID-19 were more likely to die if they were Black, were elderly, or had certain chronic comorbidities.
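The interrupted time-series approach described in the Methods can be sketched as a design-matrix construction. This is an illustrative outline, not the authors' code; the month indexing (January 2017 as month 0, with the pandemic interruption placed at month 38, i.e., March 2020) is an assumption for the example.

```python
# Illustrative sketch (not the study's code): covariates for a segmented
# (interrupted) Poisson time-series regression on monthly death counts.
# Columns: intercept, linear time trend, post-interruption indicator,
# and change-in-slope term after the interruption.

def its_design_row(t, t_interrupt):
    """Design-matrix row for month index t given the interruption month."""
    post = 1 if t >= t_interrupt else 0
    return [1, t, post, post * (t - t_interrupt)]

# Hypothetical indexing: months 0..43 span Jan 2017-Aug 2020; the
# pandemic period is assumed to begin at month 38 (Mar 2020).
design = [its_design_row(t, 38) for t in range(44)]
```

Fitting such a matrix with a Poisson GLM yields a pre-pandemic slope, a level change, and a slope change at the interruption, which is what lets the analysis compare mortality trajectories across the two periods.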
What influenced women to volunteer for service in the US military during World War II? Whereas previous literature focused on potential intrinsic and extrinsic individual-level motives, we consider the broader structural context that may have played a role in female volunteerism. We leverage original data containing information on all volunteers who served in the US Army during World War II, with detailed county-level economic, political, and demographic data, to explore patterns of female volunteerism in the military. Our findings suggest that racism and sexism played a role in female volunteerism in many parts of the country, which may have undermined the government’s goals of mobilizing the entire country in support of the war effort.
While regular monitoring of stun quality in abattoirs is now required by EU law, guidelines specific to species and stun method have not been adequately developed. Carbon dioxide (CO2) gas stunning of pigs in groups is widely used because of its efficiency and reduced pre-slaughter stress. However, some pigs may recover from the stun process if it is not correctly managed. In light of these concerns, this study aimed to develop and implement a standardised assessment of stun quality for use in commercial pig abattoirs. Eight abattoirs and 9,520 slaughter pigs were assessed for stun group size, stick time and stun quality. The stun system, CO2 concentrations and exposure times were also investigated. A stun-quality protocol (SQP) identified and risk-rated symptoms signifying recovery of consciousness. In abattoirs using paternoster stun-boxes, pigs consistently showed no stun-quality problems despite 65% of pigs having stick times between 70 and 100 s. Stun-quality problems were detected in 1.7 to 3.3% of pigs in abattoirs using dip-lift stun-boxes, where 75% of stick times were below 60 s. In 36 of 38 cases of inadequately stunned pigs, a combination of symptoms from the SQP was seen. Regular gasping preceded other symptoms in 31 cases and was a valid indicator of inadequate stunning. In response to the stun-quality assessments, two abattoirs serviced their stun machines (increasing CO2 concentrations and exposure times). All pigs were adequately stunned in follow-up studies. Implementation of stun-quality assessments, such as the one developed in this study, can assure monitoring of animal welfare at slaughter, beneficial not only to the industry and relevant authorities but also to the concerned consumer.
Cattle may suffer pain and distress if incorrectly stunned. Regular monitoring of stun quality in abattoirs is now required by EU law. This study aimed to assess stun quality in cattle slaughtered under commercial conditions. A stun protocol was developed to evaluate when inadequate stunning occurred. This included rating of identified symptoms into three levels from highest to lowest risk for inferior animal welfare. Stun-to-stick interval times, shot accuracy, repeat shots, and stun-quality variations between different cattle classes and by different shooters were also investigated. A total of 585 bulls and 413 other cattle (306 cows, 58 steers and 49 calves) were studied. Inadequate stunning occurred in 12.5% of cattle (16.7% of bulls, compared with 6.5% of other cattle). Bulls displayed symptoms rated at the highest level for inferior stun quality three times more frequently than other cattle. Despite being shot accurately, 13.6% of bulls were inadequately stunned compared with 3.8% of other cattle. Twelve percent of cattle were reshot, and 8% were inaccurately shot. Calves were shot inaccurately more frequently (14%) than other cattle. The percentage of cattle shot inaccurately ranged from 19% for the least experienced shooter to 5% for the most experienced. Stun-to-stick times averaged 105 (± 17) s, raising animal welfare questions considering the number of cattle inadequately stunned. Stun quality could be improved by using more powerful stunners for shooting bulls, regular servicing of weapons, and use of neck restraints to improve shot accuracy. This study highlights the importance of external monitoring of stun quality at slaughter.
From January 1, 2018, until July 31, 2020, our hospital network experienced an outbreak of vancomycin-resistant enterococci (VRE). The goal of our study was to improve existing processes by applying machine-learning and graph-theoretical methods to a nosocomial outbreak investigation.
Methods:
We assembled medical records generated during the first 2 years of the outbreak period (January 2018 through December 2019). We identified risk factors for VRE colonization using standard statistical methods, and we extended these with a decision-tree machine-learning approach. We then elicited possible transmission pathways by detecting commonalities between VRE cases using a graph theoretical network analysis approach.
Results:
We compared 560 VRE patients to 86,684 controls. Logistic models identified the following predictors of VRE colonization: age (aOR per 10 years, 1.4; 95% confidence interval [CI], 1.3–1.5; P < .001), ICU admission during the stay (aOR, 1.5; 95% CI, 1.2–1.9; P < .001), Charlson comorbidity score (aOR, 1.1; 95% CI, 1.1–1.2; P < .001), the number of different prescribed antibiotics (aOR, 1.6; 95% CI, 1.5–1.7; P < .001), and the number of rooms the patient stayed in during their hospitalization(s) (aOR, 1.1; 95% CI, 1.1–1.2; P < .001). The decision-tree machine-learning method confirmed these findings. Graph network analysis established 3 main pathways by which the VRE cases were connected: healthcare personnel, medical devices, and patient rooms.
Conclusions:
We identified risk factors for being a VRE carrier, along with 3 important links with VRE (healthcare personnel, medical devices, patient rooms). Data science is likely to provide a better understanding of outbreaks, but interpretations require data maturity, and potential confounding factors must be considered.
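The core idea behind the graph-theoretical step, detecting commonalities between VRE cases through shared exposures such as rooms, devices, or staff, can be sketched in a few lines. The case names and exposure data below are invented for illustration; this is not the authors' pipeline.

```python
# Hypothetical sketch of the commonality analysis: connect two VRE cases
# with an edge whenever they share an exposure (room, device, or staff
# member). All identifiers here are made up for illustration.
from itertools import combinations

exposures = {
    "case1": {"room_12", "nurse_A", "ultrasound_3"},
    "case2": {"room_12", "nurse_B"},
    "case3": {"nurse_A", "ultrasound_3"},
}

edges = {}
for a, b in combinations(sorted(exposures), 2):
    shared = exposures[a] & exposures[b]
    if shared:  # an edge labeled with the shared exposures
        edges[(a, b)] = shared
```

Edges labeled by exposure type can then be grouped to surface the dominant transmission pathways, which is how categories like "healthcare personnel" or "patient rooms" emerge from the network.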
Background: Poorly-defined cases (PDCs) of focal epilepsy are cases with no or subtle MRI abnormalities, or with abnormalities extending beyond the lesion visible on MRI. Here, we evaluated the utility of Arterial Spin Labeling (ASL) MRI perfusion in PDCs of pediatric focal epilepsy. Methods: ASL MRI was obtained in 25 consecutive children presenting with poorly-defined focal epilepsy (20 MRI-positive, 5 MRI-negative). Qualitative visual inspection and quantitative analysis with asymmetry and Z-score maps were used to detect perfusion abnormalities. ASL results were compared to the hypothesized epileptogenic zone (EZ) derived from other clinical/imaging data and to the resection zone in patients with Engel I/II outcome and >18-month follow-up. Results: Qualitative analysis revealed perfusion abnormalities in 17/25 total cases (68%): 17/20 MRI-positive cases (85%) and none of the MRI-negative cases. Quantitative analysis confirmed all cases with abnormalities on qualitative analysis, but found 1 additional true positive and 4 false positives. Concordance with the surgically-proven EZ was found in 10/11 cases qualitatively (sensitivity = 91%, specificity = 50%) and 11/11 cases quantitatively (sensitivity = 100%, specificity = 23%). Conclusions: ASL perfusion may support the hypothesized EZ but has limited localization benefit in MRI-negative cases. Nevertheless, owing to its non-invasiveness and ease of acquisition, ASL could be a useful addition to the pre-surgical MRI evaluation of pediatric focal epilepsy.
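The asymmetry maps mentioned in the Methods are typically built from a left-right asymmetry index per homologous region, then standardized as a z-score. The abstract does not give the exact formulation, so the definition below (AI = 2(L − R)/(L + R)) is one common convention, and the perfusion values and normative mean/SD are invented for illustration.

```python
# Sketch of a perfusion asymmetry index and z-score map entry.
# AI = 2*(L - R) / (L + R) is one common definition; the paper's exact
# formula is not specified in the abstract. Values are hypothetical.

def asymmetry_index(left, right):
    """Signed asymmetry between homologous left/right perfusion values."""
    return 2.0 * (left - right) / (left + right)

def z_score(value, norm_mean, norm_sd):
    """Standardize against an assumed normative distribution."""
    return (value - norm_mean) / norm_sd

ai = asymmetry_index(40.0, 60.0)  # hypoperfused left region (made-up values)
z = z_score(ai, 0.0, 0.1)         # vs. assumed normative AI mean 0, SD 0.1
```

A strongly negative z-score here would flag the left region as relatively hypoperfused, the kind of interictal finding the study compared against the hypothesized EZ.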
Recent research suggests that many faculty members believe that their students lack the information literacy (IL) skills needed to be successful in their college careers. Reports also suggest that there is a broader issue about the uncertain position of IL in the university curriculum. This article uses data from a worldwide survey of political science faculty to better understand how widespread this perception is, what is being done about this perceived problem, and what steps can be taken to encourage faculty to implement IL training in the classroom. We find that faculty believe that there is a problem, but many are not explicitly teaching IL as part of their courses. We also find that faculty members who have received IL training are far more likely to include it in their courses. This leads us to suggest that IL training should be provided at the faculty level, which will have positive downstream effects on the IL training that students receive. We also contend that IL deserves a more prominent place in the university curriculum.
This book provides an in-depth analysis of the EU Social Inclusion Process and explores the challenges ahead at local, regional, national and EU levels.
Background: Focal cortical dysplasias (FCDs) are congenital structural abnormalities of the brain, and represent the most common cause of medication-resistant focal epilepsy in children and adults. Recent studies have shown that somatic mutations (i.e. mutations arising in the embryo) in mTOR pathway genes underlie some FCD cases. Specific therapies targeting the mTOR pathway are available. However, testing for somatic mTOR pathway mutations in FCD tissue is not performed on a clinical basis, and the contribution of such mutations to the pathogenesis of FCD remains unknown. Aim: To investigate the feasibility of screening for somatic mutations in resected FCD tissue and determine the proportion and spatial distribution of FCDs which are due to low-level somatic mTOR pathway mutations. Methods: We performed ultra-deep sequencing of 13 mTOR pathway genes using a custom HaloPlexHS target enrichment kit (Agilent Technologies) in 16 resected histologically-confirmed FCD specimens. Results: We identified causal variants in 62.5% (10/16) of patients at an alternate allele frequency of 0.75–33.7%. The spatial mutation frequency correlated with the FCD lesion’s size and severity. Conclusions: Screening FCD tissue using a custom panel results in a high yield, and should be considered clinically given the important potential implications regarding surgical resection, medical management and genetic counselling.
Introduction: Emergency department (ED) staff carry a high risk for the burnout syndrome of increased emotional exhaustion, depersonalization and decreased personal accomplishment. Previous research has shown that task-oriented coping skills were associated with reduced levels of burnout compared to emotion-oriented coping. ED staff at one hospital participated in an intervention to teach task-oriented coping skills. We hypothesized that the intervention would alter staff coping behaviors and ultimately reduce burnout. Methods: ED physicians, nurses and support staff at two regional hospitals were surveyed using the Maslach Burnout Inventory (MBI) and the Coping Inventory for Stressful Situations (CISS). Surveys were performed before and after the implementation of communication and conflict resolution skills training at the intervention facility (I), consisting of a one-day course and a small-group refresher 6 to 15 months later. Descriptive statistics and multivariate analysis assessed differences in staff burnout and coping styles compared to the control facility (C) and over time. Results: 85/143 (I) and 42/110 (C) ED staff responded to the initial survey. Post-intervention, 46 (I) and 23 (C) responded. During the two-year study period there was no statistically significant difference in CISS or MBI scores between hospitals (CISS: Pillai's trace = .02, F(3,63) = .47, p = .71, partial η² = .02; MBI: Pillai's trace = .01, F(3,63) = .11, p = .95, partial η² = .01) or between pre- and post-intervention groups (CISS: Pillai's trace = .01, F(3,63) = .22, p = .88, partial η² = .01; MBI: Pillai's trace = .09, F(3,63) = 2.15, p = .10, partial η² = .01). Conclusion: We were not able to measure improvement in coping or burnout in ED staff receiving the communication skills intervention over a two-year period. Burnout is a multifactorial problem, and environmental rather than individual factors may be more important to address.
Alternatively, demonstrating a measurable effect on burnout may require more robust or more inclusive interventions.
Introduction: Point-of-Care Ultrasound (PoCUS) is being increasingly utilized during cardiac arrests for prognosis. Following the publication of recent studies, the goal of this study was to systematically review and analyze the literature to evaluate the accuracy of PoCUS in predicting return of spontaneous circulation (ROSC), survival to hospital admission (SHA), and survival to hospital discharge (SHD) in adult patients with non-traumatic, non-shockable out-of-hospital or emergency department cardiac arrest. Methods: A systematic review and meta-analysis was completed. A search of Medline, EMBASE, Cochrane, CINAHL, ClinicalTrials.gov and the World Health Organization Registry was completed from 1974 until August 24, 2018. Adult randomized controlled trials and observational studies were included. The QUADAS-2 tool was applied by two independent reviewers. Data analysis was completed according to PRISMA guidelines and with a random-effects model for the meta-analysis. Heterogeneity was assessed using I-squared statistics. Results: Ten studies (1,485 participants) were included. Cardiac activity on PoCUS had a pooled sensitivity of 59.9% (95% confidence interval 36.5%–79.4%) and specificity of 91.5% (80.8%–96.5%) for ROSC; 74.7% (58.3%–86.2%) and 80.5% (71.7%–87.4%) for SHA; and 69.4% (45.5%–86.0%) and 74.6% (59.8%–85.3%) for SHD. The sensitivity of cardiac activity on PoCUS for predicting ROSC was 24.7% (6.8%–59.4%) in the asystole subgroup compared with 77% (59.4%–88.5%) in the PEA subgroup. Cardiac activity on PoCUS, compared with its absence, had an odds ratio of 15.9 (5.9–42.5) for ROSC, 9.8 (4.9–19.4) for SHA and 5.7 (2.1–15.6) for SHD. The positive likelihood ratio (LR) was 6.65 (3.16–14.0) and the negative LR was 0.27 (0.12–0.61) for ROSC. Conclusion: Cardiac activity on PoCUS was associated with improved odds of ROSC, SHA, and SHD among adults with non-traumatic asystole and PEA.
We report a lower sensitivity and a higher negative likelihood ratio, with greater heterogeneity, compared with previous systematic reviews. PoCUS may provide valuable information in the management of non-traumatic PEA or asystole, but should not be viewed as the sole predictor of outcomes in these patients.
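The heterogeneity assessment mentioned in the Methods rests on Cochran's Q and the I-squared statistic. A minimal sketch, using invented per-study effects and variances rather than the review's data, looks like this:

```python
# Hedged sketch of the I-squared heterogeneity statistic used in
# meta-analysis. Inputs are per-study effect estimates and their
# variances; all numbers in the example below are made up.

def i_squared(effects, variances):
    """Percent of total variation across studies due to heterogeneity."""
    w = [1.0 / v for v in variances]                       # inverse-variance weights
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    if q <= df:
        return 0.0                                         # I^2 is floored at 0
    return (q - df) / q * 100.0
```

High I-squared values signal that a random-effects model (which adds a between-study variance component to each weight) is more appropriate than a fixed-effect pooling, consistent with the modeling choice reported above.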
Introduction: Although use of point-of-care ultrasound (PoCUS) protocols for patients with undifferentiated hypotension in the Emergency Department (ED) is widespread, our previously reported SHoC-ED study showed no clear survival or length-of-stay benefit for patients assessed with PoCUS. In this analysis, we examine whether the use of PoCUS changed fluid administration and rates of other emergency interventions between patients with different shock types. The primary comparison was between cardiogenic and non-cardiogenic shock types. Methods: A post-hoc analysis was completed on the database from an RCT of 273 patients who presented to the ED with undifferentiated hypotension (SBP < 100 mmHg or shock index > 1) and who had been randomized to receive standard care with or without PoCUS in 6 centres in Canada and South Africa. PoCUS-trained physicians performed scans after initial assessment. Shock categories and diagnoses, recorded 60 minutes after ED presentation, were used to allocate patients into subcategories of shock for analysis of treatment. We analyzed actual care delivered, including initial IV fluid bolus volumes (mL), rates of inotrope use and major procedures. Standard statistical tests were employed. Sample size was powered at 0.80 (α: 0.05) for a moderate difference. Results: Although there were expected differences in the mean fluid bolus volume between patients with non-cardiogenic and cardiogenic shock, there was no difference in fluid bolus volume between the control and PoCUS groups (non-cardiogenic control 1878 mL (95% CI 1550–2206 mL) vs. non-cardiogenic PoCUS 1687 mL (1458–1916 mL); and cardiogenic control 768 mL (194–1341 mL) vs. cardiogenic PoCUS 981 mL (341–1620 mL)). Likewise, there were no differences in rates of inotrope administration or major procedures for any of the subcategories of shock between the control group and PoCUS group patients. The most common subcategory of shock was distributive.
Conclusion: Despite differences in care delivered by subcategory of shock, we did not find any significant difference in actual care delivered between patients who were examined using PoCUS and those who were not. This may help to explain the previously reported lack of outcome difference between groups.
Introduction: Point-of-care ultrasound has been reported to improve diagnosis in non-traumatic hypotensive ED patients. We compared the diagnostic performance of physicians with and without PoCUS in undifferentiated hypotensive patients as part of an international prospective randomized controlled study. The primary outcome was the diagnostic performance of PoCUS for cardiogenic vs. non-cardiogenic shock. Methods: SHoC-ED recruited hypotensive patients (SBP < 100 mmHg or shock index > 1) in 6 centres in Canada and South Africa. We describe previously unreported secondary outcomes relating to diagnostic accuracy. Patients were randomized to standard clinical assessment (No PoCUS) or PoCUS groups. PoCUS-trained physicians performed scans after initial assessment. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses, including shock category, were recorded at 0 and 60 minutes. Final diagnosis was determined by independent blinded chart review. Standard statistical tests were employed. Sample size was powered at 0.80 (α: 0.05) for a moderate difference. Results: 273 patients were enrolled, with follow-up for the primary outcome completed for 270. Baseline demographics and perceived category of shock were similar between groups. 11% of patients were determined to have cardiogenic shock. PoCUS had a sensitivity of 80.0% (95% CI 54.8 to 93.0%), specificity 95.5% (90.0 to 98.1%), LR+ve 17.9 (7.34 to 43.8), LR-ve 0.21 (0.08 to 0.58), diagnostic OR 85.6 (18.2 to 403.6) and accuracy 93.7% (88.0 to 97.2%) for cardiogenic shock. Standard assessment without PoCUS had a sensitivity of 91.7% (64.6 to 98.5%), specificity 93.8% (87.8 to 97.0%), LR+ve 14.8 (7.1 to 30.9), LR-ve 0.09 (0.01 to 0.58), diagnostic OR 166.6 (18.7 to 1481) and accuracy of 93.6% (87.8 to 97.2%). There was no significant difference in sensitivity (-11.7% (-37.8 to 18.3%)) or specificity (1.73% (-4.67 to 8.29%)).
Diagnostic performance was also similar between other shock subcategories. Conclusion: As reported in other studies, PoCUS-based assessment performed well diagnostically in undifferentiated hypotensive patients, especially as a rule-in test. However, performance was similar to standard (non-PoCUS) assessment, which was excellent in this study.
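The accuracy metrics quoted above (sensitivity, specificity, likelihood ratios, diagnostic odds ratio) all derive from a single 2x2 table of test result against final diagnosis. A minimal sketch, using hypothetical counts rather than the study's data, shows the arithmetic:

```python
# Standard 2x2 diagnostic-accuracy calculations. The counts below are
# hypothetical and chosen only to illustrate the formulas; they are not
# the SHoC-ED data.

def diagnostics(tp, fp, fn, tn):
    """Return (sensitivity, specificity, LR+, LR-, diagnostic OR, accuracy)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)          # how much a positive test raises odds
    lr_neg = (1 - sens) / spec          # how much a negative test lowers odds
    dor = lr_pos / lr_neg               # equals (tp*tn)/(fp*fn)
    acc = (tp + tn) / (tp + fp + fn + tn)
    return sens, spec, lr_pos, lr_neg, dor, acc

sens, spec, lr_pos, lr_neg, dor, acc = diagnostics(tp=24, fp=10, fn=6, tn=230)
```

A high LR+ with a modest LR- is exactly the "good rule-in, weaker rule-out" profile described in the conclusion.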
Susceptibility of a system to colonization by a weed is in part a function of environmental resource availability. Doveweed [Murdannia nudiflora (L.) Brenan] can establish in a variety of environments; however, it is found mostly in wet or low-lying areas with reduced interspecies competition. Four studies evaluated the effect of mowing height, interspecies competition, and nitrogen, light, and soil moisture availability on M. nudiflora establishment and growth. A field study evaluated the effect of mowing height on M. nudiflora establishment. In comparison with unmowed plots, mowing at 2 and 4 cm reduced spread 46% and 30%, respectively, at 9 wk after planting. The effect of mowing height and nitrogen fertilization on ‘Tifway’ bermudagrass (Cynodon dactylon Burtt-Davy × C. transvaalensis L. Pers.) and M. nudiflora interspecies competition was evaluated in a greenhouse trial. Murdannia nudiflora coverage was 62% greater in flats maintained at 2.6 cm than in flats maintained at 1.3 cm. Supplemental application of 49 kg N ha⁻¹ mo⁻¹ increased M. nudiflora coverage 75% in comparison with 24.5 kg N ha⁻¹ mo⁻¹. A difference in M. nudiflora coverage could not be detected between flats receiving 0 and 24.5 kg N ha⁻¹ mo⁻¹, suggesting moderate nitrogen fertilization does not encourage M. nudiflora colonization. The effect of light availability on M. nudiflora growth and development was evaluated in a greenhouse study. Growth in a 30%, 50%, or 70% reduced light environment (RLE) did not affect shoot growth on a dry weight basis in comparison with plants grown under full irradiance; however, internode length was 28% longer in a 30% RLE and 39% longer in a 50% and 70% RLE. The effect of soil moisture on M. nudiflora growth and development was evaluated in a greenhouse study. Plants maintained at 50%, 75%, and 100% field capacity (FC) increased biomass >200% compared with plants maintained at 12.5% or 25% FC.
The degree of transport and retention of ¹⁴CH₄ in soil is being investigated in a series of laboratory experiments in preparation for field-scale trials at the University of Nottingham. The experimental programme focusses on the behaviour and fate of ¹⁴CH₄ injected into subsoil and its subsequent incorporation into vegetation under field conditions. Due to restrictions on the use of radioactive tracers in the field, ¹³CH₄ is being used as a surrogate gas which can be handled conveniently in the laboratory and field and which can be measured with high precision using gas chromatography with isotope ratio mass spectrometry. The laboratory data indicate significant differences between the diffusion and oxidation rates of ¹³CH₄ in re-packed and undisturbed soil columns, with both rates appearing to be significantly lower in undisturbed soils. Data from both laboratory and field experiments will be used to inform the development of a model of ¹⁴CH₄ migration and its fate in the biosphere above a geological disposal facility.
Introduction: Situational awareness (SA) is essential for maintenance of scene safety and effective resource allocation in mass casualty incidents (MCI). Unmanned aerial vehicles (UAV) can potentially enhance SA with real-time visual feedback during chaotic, evolving or inaccessible events. The purpose of this study was to test the ability of paramedics to use UAV video from a simulated MCI to identify scene hazards, initiate patient triage, and designate key operational locations. Methods: A simulated MCI, including fifteen patients of varying acuity (blast-type injuries) plus four hazards, was created on a college campus. The scene was surveyed by UAV, capturing video of all patients, hazards, surrounding buildings and streets. Attendees of a provincial paramedic meeting were invited to participate. Participants received a lecture on SALT Triage and the principles of MCI scene management. Next, they watched the UAV video footage. Participants were directed to sort patients according to SALT Triage step one, identify injuries, and localize the patients within the campus. Additionally, they were asked to select a start point for SALT Triage step two, identify and locate hazards, and designate locations for an Incident Command Post, Treatment Area, Transport Area and Access/Egress routes. Summary statistics were performed and a linear regression model was used to assess relationships between demographic variables and both patient triage and localization. Results: Ninety-six individuals participated. Mean age was 35 years (SD 11), 46% (44) were female, and 49% (47) were Primary Care Paramedics. Most participants (80 (84%)) correctly sorted at least 12 of 15 patients. Increased age was associated with decreased triage accuracy [-0.04 (-0.07, -0.01); p = 0.031]. Fifty-two (54%) were able to localize 12 or more of the 15 patients to a 27 × 20 m grid area.
Advanced paramedic certification and local residency were associated with improved patient localization [2.47 (0.23, 4.72); p = 0.031] and [-3.36 (-5.61, -1.1); p = 0.004], respectively. The majority of participants (78 (81%)) chose an acceptable location to start SALT Triage step two, and 84% (80) identified at least three of four hazards. Approximately half (53 (55%)) of participants designated four or more of five key operational areas in appropriate locations. Conclusion: This study demonstrates the potential of UAV technology to remotely provide emergency responders with SA in an MCI. Additional research is required to further investigate optimal strategies to deploy UAVs in this context.