The World Health Organization established the Emergency Medical Team (EMT) initiative in 2013 to standardize disaster response, emphasizing robust education and training programs. The Canadian Medical Assistance Teams (CMAT), a volunteer-run NGO with over 1,000 members, struggles with logistical and financial constraints for in-person training.
Objectives:
This study evaluates the effectiveness of virtually delivered TEAMS 3.0 tabletop modules for training CMAT’s volunteers, hypothesizing that virtual training is effective and comparable to in-person training. The specific objectives were to adapt TEAMS 3.0 tabletop exercises into a virtual format, to assess their effectiveness, and to compare the effectiveness of virtual and in-person training.
Method/Description:
A quasi-experimental design with non-randomized groups was used. CMAT members were assigned to in-person or virtual training based on availability. Pre- and post-training surveys assessed self-efficacy, teamwork, and training quality. Statistical analysis using SPSS employed non-parametric tests to compare pre- and post-training scores and between-group differences. Qualitative feedback was collected via a post-training anonymous form.
Results/Outcomes:
Four TEAMS 3.0 exercises were adapted for virtual delivery using Google Meet and Google collaborative tools. Among 26 participants (10 in-person, 16 virtual), both formats showed no significant changes in self-efficacy or teamwork scores from pre- to post-training. In-person training received significantly higher quality ratings from trainees compared to virtual training (p=0.026). Trainers’ quality ratings also favored in-person training but were not statistically significant (p=0.091).
Conclusion:
Virtual TEAMS 3.0 exercises yielded similar self-efficacy and teamwork results as in-person training, though in-person sessions were rated higher quality. This supports virtual training as a scalable, cost-effective alternative, though further research with larger samples is needed.
In acute ischemic stroke, a longer time from onset to endovascular treatment (EVT) is associated with worse clinical outcome. We investigated the association of clinical outcome with time from last known well to arrival at the EVT hospital and time from hospital arrival to arterial access for anterior circulation large vessel occlusion patients treated > 6 hours from last known well.
Methods:
Retrospective analysis of the prospective, multicenter cohort study ESCAPE-LATE. Patients presenting > 6 hours after last known well with anterior circulation large vessel occlusion undergoing EVT were included. The primary outcome was the modified Rankin Scale (mRS) score at 90 days. Secondary outcomes were good (mRS 0–2) and poor clinical outcomes (mRS 5–6) at 90 days, as well as the National Institutes of Health Stroke Scale at 24 hours. Associations of time intervals with outcomes were assessed with univariable and multivariable logistic regression.
Results:
Two hundred patients were included in the analysis, of whom 85 (43%) were female. The 90-day mRS was available for 141 patients. Collateral status was available for 150 patients, of whom 135 (90%) had moderate-to-good collaterals; the median Alberta Stroke Program Early CT Score (ASPECTS) was 8 (IQR = 7–10). No association between ordinal mRS and time from last known well to arrival at the EVT hospital (odds ratio [OR] = 1.01, 95% CI = 1.00–1.02) or time from hospital arrival to arterial access (OR = -0.01, 95% CI = -0.02 to 0.00) was seen in adjusted regression models.
Conclusion:
No relationship was observed between pre-hospital or in-hospital workflow times and clinical outcomes. Baseline ASPECTS and collateral status were favorable in the majority of patients, suggesting that physicians may have chosen to predominantly treat slow progressors in the late time window, in whom prolonged workflow times have less impact on outcomes.
Background:
Cerebral venous thrombosis (CVT) most commonly affects younger women. Diagnosis may be delayed because its presentation and demographic profile differ from those of other stroke types.
Methods:
We examined delays to diagnosis of CVT in the SECRET randomized trial and the TOP-SECRET parallel registry. Adults diagnosed with symptomatic CVT within 14 days were included. We examined time to diagnosis and the number of health care encounters prior to diagnosis, and their associations with demographics, clinical and radiologic features, and functional and patient-reported outcome measures (PROMs) at days 180 and 365.
Results:
Of 103 participants, 68.9% were female; median age was 45 years (IQR 31.0–61.0). Median time from symptom onset to diagnosis was 4 (1–8) days. Diagnosis on first presentation to medical attention was made in 60.2%. The difference in time to diagnosis for single versus multiple presentations was on the order of days (3 [1–7] vs. 5 [2–11.75] days, p = 0.16). Women were more likely to have multiple presentations (OR 2.53; 95% CI 1.00–6.39; p = 0.05) and had longer median times to diagnosis (5 [2–8] days vs. 2 [1–4.5] days; p = 0.005). However, this was not associated with absolute values of, or change in, functional or patient-reported outcome measures at days 180 and 365.
Conclusions:
Diagnosis of CVT was commonly delayed, and women were more likely to have multiple presentations. We found no association between delayed diagnosis and outcomes.
To assess cost-effectiveness of late time-window endovascular treatment (EVT) in a clinical trial setting and a “real-world” setting.
Methods:
Data are from the randomized ESCAPE trial and a prospective cohort study (ESCAPE-LATE). Anterior circulation large vessel occlusion patients presenting > 6 hours from last known well were included; collateral status was an inclusion criterion for ESCAPE but not for ESCAPE-LATE. A Markov state transition model was built to estimate lifetime costs and quality-adjusted life-years (QALYs) for EVT in addition to best medical care vs. best medical care only, in a clinical trial setting (comparing ESCAPE-EVT to ESCAPE control-arm patients) and in a “real-world” setting (comparing ESCAPE-LATE to ESCAPE control-arm patients). We performed an unadjusted analysis using 90-day modified Rankin Scale (mRS) scores as model input, and an analysis adjusted for baseline factors. Acceptability of EVT was calculated using willingness-to-pay thresholds of 50,000 and 100,000 USD per QALY.
Results:
Two hundred and forty-nine patients were included (ESCAPE-LATE: n = 200, ESCAPE EVT-arm: n = 29, ESCAPE control-arm: n = 20). Late EVT in addition to best medical care was cost-effective in the unadjusted analysis in both the clinical trial and the real-world setting, with acceptability of 96.6%–99.0%. After adjusting for differences in baseline variables between the groups, late EVT was marginally cost-effective in the clinical trial setting (acceptability: 49.9%–61.6%), but not in the “real-world” setting (acceptability: 32.9%–42.6%).
Conclusion:
EVT for LVO patients presenting beyond 6 hours was cost-effective in both the clinical trial and “real-world” settings, although this was largely driven by baseline patient differences favoring the “real-world” EVT group. After adjusting for these, the benefit of EVT was reduced in the trial setting and absent in the real-world setting.
Recent years have seen a flourishing of everyday experimentations with the category of religion: the “spiritual but not religious,” “religionless” Christians, and many more. Why is there such proliferation of popular experimentation with—and often distancing from—the category of religion? This article explores two such cases of experimentation, a religion-disavowing evangelical Christian brotherhood in Mexico and a Masonic lodge in Switzerland, and shows how, in these two cases, disavowing religion is in part a response to problems associated with a founding principle of liberalism, the separation of private conscience from public citizenship. Subjects of liberal separation are vulnerable to feelings of cloistered conscience and hollow citizenship, problems that are inherent to liberal separation, as evidenced by Freemasonry’s age-old experimentations. These problems are also, however, exacerbated by dwindling popular faith in the institutions of religion and liberal democracy, as evidenced by contemporary evangelical trends of which the Christian brotherhood is exemplary. Such experimentations can be distinguished between those that collapse conscience and citizenship and those that defend the separation while still looking for indirect connections. This contrast is also highlighted by the comparison of religion-disavowing evangelical Christians and Freemasonry.
The pace and trajectory of global and local environmental changes are jeopardizing our health in numerous ways, among them exacerbating the risk of disease emergence and spread both in the community and in the healthcare setting via healthcare-associated infections (HAIs). Factors such as climate change, widespread land alteration, and biodiversity loss underlie changing human–animal–environment interactions that drive disease vectors, pathogen spillover, and cross-species transmission of zoonoses. Climate change–associated extreme weather events also threaten critical healthcare infrastructure, infection prevention and control (IPC) efforts, and treatment continuity, adding stress to already strained systems and creating new areas of vulnerability. These dynamics increase the likelihood of developing antimicrobial resistance (AMR), vulnerability to HAIs, and high-consequence hospital-based disease transmission. Using a One Health approach to both human and animal health systems, we can become climate smart by re-examining our impacts on, and relationships with, the environment. We can then work collaboratively to reduce and respond to the growing threat and burden of infectious diseases.
Depression and borderline personality disorder (BPD) are both thought to be accompanied by alterations in the subjective experience of environmental rewards. We evaluated responses in women to sweet, bitter and neutral tastes (juice, quinine and water): 29 with depression, 17 with BPD and 27 healthy controls. The BPD group gave lower pleasantness and higher disgust ratings for quinine and juice compared with the control group; the depression group did not differ significantly from the control group. Juice disgust ratings were related to self-disgust in BPD, suggesting close links between abnormal sensory processing and self-identity in BPD.
Crop-raiding by primates and bushpigs Potamochoerus porcus is a major cause of human–wildlife conflict around Budongo Forest Reserve, Uganda. In 2006–2007 a project was initiated, with farmer participation, to investigate the efficacy of on-farm techniques to reduce crop-raiding, including guarding and early-warning techniques, fences, plant barriers, trenches, lights and nets. Here, farmers' perceptions of the effectiveness and sustainability of these deterrents were evaluated using semi-structured interviews and direct observations. Factors important to farmers in effective, sustainable and locally appropriate crop-raiding mitigation are that deterrents be cost-effective, easily manipulated, improve guarding efficiency and require minimal labour inputs. Farmers reported paid guards, guard dogs, wire fences, lights and bells/alarms as most effective. This differs from observations that farmers independently maintained certain deterrents that they presumably considered valuable, namely wire fences, guard dogs, bells/alarms, trenches, lights and nets. This evaluation demonstrates the importance of farmers' participation and perceptions in the viability and uptake of crop-raiding deterrents, and the importance of assessing conflict mitigation trials over the long term.
Biodiversity is recognised to be of global importance, yet species and habitats continue to be under increasing pressure from human-induced influences. Environmental concerns are high on the political agenda, driving increased legislation to protect the natural environment. The starting point for much of this legislation is the requirement for a comprehensive biodiversity audit. For those needing to undertake such audits, this Handbook, first published in 2005, provides standard procedures which will enable practitioners to better monitor the condition of the biodiversity resource, resulting in improved data upon which to base future policy decisions and actions. Organised in three parts, the Handbook first addresses planning, covering method selection, experimental design, sampling strategy, and data analysis and evaluation. The second part describes survey, evaluation and monitoring methods for a broad range of habitats. Part three considers species and provides information on general methods before addressing specific methods of survey and monitoring for the major taxonomic groups.
The flux through the de novo fatty acid synthesis pathway was estimated in lines of mice which differed substantially in fat content following 26 generations of selection at 10 weeks of age. Previous estimates of lipogenic enzyme activities had indicated an increase in the capacity for lipogenesis in the Fat compared to the Lean line. Therefore the in vivo flux in lipogenesis was measured in both liver and gonadal fat pad (GFP) tissues of males at 5 and 10 weeks of age, using the rate of incorporation of 3H from 3H2O and 14C from acetate and citrate into total lipids. At both ages and in both tissues the Fat line had a higher flux: about a 20% increase in the liver and up to a three-fold increase (range 1·2- to 3·4-fold) in the GFP. We conclude that direct selection for fatness in mice has resulted in metabolic changes in the rate of de novo fatty acid synthesis, and that the changes are largely detectable before 10 weeks, the age of selection.
Estimates of the activities (Vmax) of four enzymes that generate the coenzyme NADPH, an absolute requirement for tissue fatty-acid synthesis, and of the concentration of NADP plus NADPH, were made in lines of mice differing in fat content. These lines had been selected from the same base population for 20 generations, and 3 High replicates, 3 Low replicates and 1 unselected Control were used. Analyses were performed on liver and gonadal fat pad (GFP) of males at 5 and 10 weeks of age. In both the liver and the GFP, measurable activities of the four enzymes — glucose-6-phosphate dehydrogenase (G6PDH), 6-phosphogluconate dehydrogenase (6PGDH), isocitrate dehydrogenase (IDH) and malic enzyme (ME) — expressed per mg soluble protein were, with minor exceptions, higher in the Fat (F) than in the Lean (L) lines at both ages, the highest ratio being 2·2 for ME in the GFP. The relationships between these measurable activities (Vmax) and in vivo lipogenesis are, however, not known. When expressed per gram tissue, the ratios of F to L in the GFP were less than 1 in most cases, presumably because of the very different adipocyte numbers and/or sizes between the lines. There were no significant differences between the lines in the concentration of NADP plus NADPH per gram tissue in liver or GFP, suggesting that F lines converted NADP to NADPH faster than L lines. It is predicted that selection on the enzyme activities would be less efficient than direct selection at changing fat content.
Energy balance is the difference between energy consumed and total energy expended. Over a given period of time it expresses how much the body's stores of fat, carbohydrate and protein will change. For the critically-ill patient, who characteristically exhibits raised energy expenditure and proteolysis of skeletal muscle, energy balance information is valuable because underfeeding or overfeeding may compromise recovery. However, there are formidable difficulties in measuring energy balance in these patients. While energy intake can be accurately recorded in the intensive care setting, the measurement of total energy expenditure is problematic. Widely used approaches, such as direct calorimetry or doubly-labelled water, are not applicable to the critically-ill patient. Energy balance was determined over periods of 5–10 d in patients in intensive care by measuring changes in the fat, protein and carbohydrate stores of the body. Changes in total body fat were positively correlated with energy balance over the 5 d study periods in patients with severe sepsis (n 24, r 0.56, P=0.004) or major trauma (n 24, r 0.70, P<0.0001). Fat oxidation occurred in patients whose energy intake was insufficient to achieve energy balance. Changes in body protein were independent of energy balance. These results are consistent with those of other researchers who have estimated total energy requirements from measurements of O2 consumption and CO2 production. In critically-ill patients, achievement of positive non-protein energy balance or total energy balance does not prevent negative N balance. Nutritional therapy for these patients may in the future focus on glycaemic control with insulin and specialised supplements rather than on energy balance per se.
Substance use is implicated in the cause and course of psychosis.
Aims
To characterise substance and alcohol use in an epidemiologically representative treatment sample of people experiencing a first psychotic episode in south Cambridgeshire.
Method
Current and lifetime substance use was recorded for 123 consecutive referrals to a specialist early intervention service. Substance use was compared with general population prevalence estimates from the British Crime Survey.
Results
Substance use among people with first-episode psychosis was twice that of the general population and was more common in men than women. Cannabis abuse was reported in 51% of patients (n=62) and alcohol abuse in 43% (n=53). More than half (n=68, 55%) had used Class A drugs, and 38% (n=43) reported polysubstance abuse. Age at first use of cannabis, cocaine, ecstasy and amphetamine was significantly associated with age at first psychotic symptom.
Conclusions
Substance misuse is present in the majority of people with first-episode psychosis and has major implications for management. The association between age at first substance use and first psychotic symptoms has public health implications.
Various techniques (e.g. quadrats and transects) have been used to mark permanent plots; these are described briefly below. A general point to consider is that the more techniques used, the quicker it will normally be to find plots again.
MAPPING
Measurements to nearby features have been widely used to map locations of plots and are relatively foolproof, provided that mapping is accurate (use a backsighting compass for bearings and measure distances correctly) and that the features chosen are fixed and permanent. This is particularly important for long-term monitoring studies; features such as fence posts may be damaged or lost over time. However, the method is often difficult to apply in large homogeneous habitats, such as grasslands, where obvious permanent features are lacking. It is also time-consuming when a large number of plots need to be relocated.
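As an illustrative sketch only (not a procedure from the Handbook), the bearing-and-distance records described above can be converted into map coordinates for relocating a plot. This assumes Python, flat ground, bearings in degrees clockwise from north, and an arbitrary grid position for the reference feature.

```python
import math

def offset_from_feature(feature_x, feature_y, bearing_deg, distance_m):
    """Position of a plot given a compass bearing (degrees clockwise
    from north) and a measured distance from a fixed mapped feature,
    such as a fence post. On the map grid, x increases east and y
    increases north."""
    theta = math.radians(bearing_deg)
    return (feature_x + distance_m * math.sin(theta),
            feature_y + distance_m * math.cos(theta))

# A plot recorded as 25 m from a post on a bearing of 90 degrees (due east):
x, y = offset_from_feature(100.0, 200.0, 90.0, 25.0)
# x is 25 m east of the post; y is unchanged
```

Reversing the calculation (grid offset back to a bearing and distance) gives the figures needed in the field to walk from the feature to the plot, which is why accurate backsighted bearings matter: a small bearing error grows into a large positional error over a long measured distance.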
MARKER POSTS
Wooden or metal posts are widely used and can be quick to re-find in relatively small sites. However, small markers can be hidden by vegetation. Large markers can cause significant damage to habitats, tend to be unsightly and attract the attention of people. Animals too may scratch against large markers, thereby causing disproportionate disturbance to vegetation, resulting in bias in the sampling. Unless markers are strong and well secured they may be broken by livestock or removed by vandals, etc. Posts may also be lost over time through rotting or corrosion and even frost heave.
PAINT
Paint has been used to mark plots, especially where rocks, walls or posts are available nearby.
The size of a quadrat affects the measured values of frequency, density, and cover, etc. (see Figure A4.1 below). It is therefore important to decide in advance which values are to be measured. Experience has shown that different vegetation types and different measurement types require different quadrat sizes. In Part II, quadrat methods for habitat monitoring are described in Sections 6.4.2 (frame quadrats for cover and density estimates); 6.4.3 (random mini-quadrats for frequency estimates); 6.4.4 (FIBS analysis); and 6.4.5 (point quadrats). Quadrat size is also considered in the section on NVC mapping (Section 6.1.6). In Part III, the chapters on species groups and Chapter 10 also contain discussions of quadrat methods, where appropriate to the species group concerned.
This appendix deals with the selection of the appropriate quadrat size. Methods for calculating the number of quadrats required are given in Part I, Section 2.3.4. Frequency estimates are given the most attention, because quadrat size affects frequency measures more than others (see Figure A4.1). However, the lists of optimum quadrat size for different vegetation types can generally be applied to all quadrat sampling methods (with the obvious exception of point quadrats).
Techniques for determining optimum quadrat size for frequency measures are subjective, and a quadrat of any size will sample some species more adequately than others. The quadrat size chosen will therefore depend upon the type of vegetation being sampled. The use of random mini-quadrats for estimating frequency is described in Part II, Section 6.4.3.
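As a minimal illustration of the frequency measure itself (an assumption-laden sketch, not a Handbook method): frequency is simply the proportion of sampled quadrats in which a species is recorded present, which is why the chosen quadrat size affects it so strongly.

```python
def frequency(presence_records):
    """Frequency of a species: the proportion of sampled quadrats in
    which it was recorded present (0-1, often reported as a percentage)."""
    return sum(presence_records) / len(presence_records)

# Presence (1) / absence (0) of one species in 10 random mini-quadrats:
records = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
freq = frequency(records)  # 0.6, i.e. a frequency of 60%
```

For the same vegetation, larger quadrats produce more presences (more 1s) and therefore a higher measured frequency, so frequency values are only comparable between surveys that used the same quadrat size.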