University students often face high levels of stress and sleep disturbances due to their academic demands and lifestyle factors(1). Ashwagandha (Withania somnifera), an adaptogenic herb, has shown the potential to mitigate stress and improve cognitive function(2). However, limited research has examined its effects on these variables in university students. This study aimed to determine the effects of ashwagandha supplementation on sleep quality, mood, and cognitive function in university students.
A randomized, double-blind, placebo-controlled crossover study was used. Nine university students (5 males, 4 females; age: 21±1 years; BMI: 25±2.5 kg/m²) were randomly assigned to receive 500 mg of standardized ashwagandha root extract capsules for 7 days or a placebo (encapsulated cornstarch), with a 7-day washout between treatments. Sleep was measured during the 7-day supplementation period using the Loughborough Daily Sleep Diary. Post-supplementation mood and cognitive function were measured by the Profile of Mood States (POMS) scale(3) and the computerised Stroop and Deary-Liewald simple and choice reaction time tasks(4). Paired-sample t-tests were used to determine differences between the ashwagandha and placebo conditions, with calculated effect sizes (Cohen’s d).
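A minimal sketch of the paired comparison described above is given below; the scores are invented stand-ins (the study's raw data are not reported in the abstract), and the pooled-SD form of Cohen's d is one common convention for paired designs rather than necessarily the formula used in the study.

```python
import numpy as np
from scipy import stats

# Hypothetical paired POMS confusion scores, one value per participant per
# condition; placeholders only, not the study's data.
ashwagandha = np.array([4, 6, 3, 5, 7, 4, 5, 6, 3], dtype=float)
placebo     = np.array([7, 9, 5, 8, 11, 6, 7, 10, 5], dtype=float)

# Paired-sample t-test across the two conditions.
t, p = stats.ttest_rel(ashwagandha, placebo)

# Cohen's d from the pooled SD of the two conditions (one common convention;
# the paper's exact formula is not stated in the abstract).
pooled_sd = np.sqrt((ashwagandha.std(ddof=1) ** 2 + placebo.std(ddof=1) ** 2) / 2)
d = (ashwagandha.mean() - placebo.mean()) / pooled_sd
print(f"t = {t:.2f}, P = {p:.3f}, d = {d:.2f}")
```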
Participants reported lower confusion scores on the POMS following ashwagandha compared with the placebo (mean±SD: 4.8±2.0 vs 7.6±3.1 arbitrary units; P=0.03; d = −0.92). No differences were found for any other mood indicators or for sleep or cognitive function parameters (P > 0.05).
These data suggest that ashwagandha may reduce feelings of confusion in university students, but further studies with larger sample sizes are needed to verify these findings and elucidate the underlying mechanisms.
Cyber Operational Risk: Cyber risk is routinely cited, in various publications and surveys, as one of the most important sources of operational risk facing organisations today. Further, in recent years, cyber risk has entered the public consciousness through highly publicised events involving affected UK organisations such as TalkTalk, Morrisons and the NHS. Regulators and legislators are increasing their focus on this topic, with the General Data Protection Regulation (“GDPR”) a notable example. Risk actuaries and other risk management professionals at insurance companies therefore need a robust assessment of the potential losses stemming from cyber risk that their organisations may face. They should be able to perform this assessment as part of an overall risk management framework and to demonstrate it to stakeholders such as regulators and shareholders. Given that cyber risks are still very much new territory for insurers and there is no commonly accepted practice, this paper describes a proposed framework in which to perform such an assessment. As part of this, we leverage two existing frameworks – the Chief Risk Officer (“CRO”) Forum cyber incident taxonomy, and the National Institute of Standards and Technology (“NIST”) framework – to describe, respectively, the taxonomy of a cyber incident and the relevant cyber security and risk mitigation items for the incident in question.
Summary of Results: Three detailed scenarios have been investigated by the working party:
∙ Employee leaks data at a general (non-life) insurer: Internal attack through social engineering, causing large compensation costs and regulatory fines, driving a 1 in 200 loss of £210.5m (c. 2% of annual revenue).
∙ Cyber extortion at a life insurer: External attack through social engineering, causing large business interruption and reputational damage, driving a 1 in 200 loss of £179.5m (c. 6% of annual revenue).
∙ Motor insurer telematics device hack: External attack through software vulnerabilities, causing large remediation / device replacement costs, driving a 1 in 200 loss of £70.0m (c. 18% of annual revenue). (The revenue base each of these percentages implies is sketched below.)
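As a quick plausibility check, the annual revenue base for each insurer follows directly from the quoted 1 in 200 loss and its share of revenue; this back-of-envelope sketch uses only figures from the scenario list above, and the "c." percentages mean the results are approximate.

```python
# Implied annual revenue = 1-in-200 loss / quoted share of revenue.
# Figures taken from the three scenarios above; labels are shorthand.
scenarios = {
    "Data leak (general insurer)":     (210.5e6, 0.02),
    "Cyber extortion (life insurer)":  (179.5e6, 0.06),
    "Telematics hack (motor insurer)": (70.0e6, 0.18),
}
for name, (loss_gbp, share) in scenarios.items():
    implied_revenue = loss_gbp / share
    print(f"{name}: implied annual revenue ~ GBP {implied_revenue / 1e9:.2f}bn")
```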
Limitations: The following are the key limitations of the work presented in this paper:
∙ While the presented scenarios are deemed material at this point in time, the threat landscape moves fast and could render specific narratives and calibrations obsolete within a short time-frame.
∙ There is a lack of historical data to base certain scenarios on and therefore a high level of subjectivity is used to calibrate them.
∙ No allowance has been made for seasonality of renewals (a cyber event coinciding with peak renewal season could exacerbate cost impacts).
∙ No consideration has been given to the impact of the event on the share price of the company.
∙ Correlation with other risk types has not been explicitly considered.
Conclusions: Cyber risk is a very real threat and should not be ignored or treated lightly in operational risk frameworks, as it has the potential to threaten the ongoing viability of an organisation. Risk managers and capital actuaries should be aware of the various sources of cyber risk and their potential impacts, to ensure that the business is sufficiently prepared for such an event. When it comes to quantifying the impact of cyber risk on the operations of an insurer, there are significant challenges, not least that the threat landscape is ever-changing and there is a lack of historical experience on which to base assumptions. Given this uncertainty, this paper sets out a framework with which readers can bring consistency to the way scenarios are developed over time. It provides a common taxonomy to ensure that key aspects of cyber risk are considered, and sets out examples of how to implement the framework. It is critical that insurers endeavour to understand cyber risk better and look to refine assumptions over time as new information is received. In addition to ensuring that sufficient capital is held for key operational risks, the investment in understanding cyber risk now will help to educate senior management and could have benefits through influencing internal cyber security capabilities.
Programmes for the geological disposal of radioactive wastes are by nature extremely complex. A structured approach for making and documenting varied kinds of decisions is required to support programme design and implementation. At each programme stage, the decision-making process must be able to identify and justify key priorities for work, to reduce uncertainties.
To support structured decision-making, evidence support logic (ESL) has been developed and applied to varied complex projects, nationally and internationally, in several industries. Evidence support logic involves breaking down a hypothesis that informs a decision into a hierarchical 'decision tree'. Examples of hypotheses are 'the geology associated with site x will provide sufficient disposal capacity', 'container x will contain waste form y for z years' and 'the engineered barrier system will provide the required safety functions'. Independent evaluations of confidence 'for' and 'against' bottom-level hypotheses allow the level of remaining uncertainty (or conflict) to be recognized explicitly, and the overall confidence (and uncertainty) relevant to the decision, and key sensitivities, to be represented clearly and succinctly.
Thus ESL can help (1) break down decisions into a manageable and logical structure, assisting clear presentation; (2) identify key uncertainties and sensitivities to inform prioritization; and (3) test whether the outcomes of specific studies have improved confidence.
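To make the idea concrete, here is a minimal, purely illustrative sketch of such a decision tree in Python. The node names, weights and the weighted-average roll-up rule are assumptions chosen for illustration, not the published ESL aggregation rules; the point is only to show confidence 'for' and 'against' held separately, with uncertainty as the explicit remainder.

```python
from dataclasses import dataclass, field

@dataclass
class Hypothesis:
    """A node in a toy ESL-style decision tree. confidence_for and
    confidence_against are independent judgements in [0, 1]; whatever
    is left over is explicit remaining uncertainty."""
    name: str
    confidence_for: float = 0.0
    confidence_against: float = 0.0
    weight: float = 1.0                      # contribution to the parent
    children: list = field(default_factory=list)

    def propagate(self):
        """Roll confidence up from the leaves (a weighted average is just
        one of several possible aggregation rules)."""
        if self.children:
            total = sum(c.weight for c in self.children)
            for c in self.children:
                c.propagate()
            self.confidence_for = sum(
                c.weight * c.confidence_for for c in self.children) / total
            self.confidence_against = sum(
                c.weight * c.confidence_against for c in self.children) / total
        return self

    @property
    def uncertainty(self):
        return 1.0 - self.confidence_for - self.confidence_against

# Hypothetical bottom-level hypotheses under one of the example top-level
# hypotheses from the text; all numbers are invented.
root = Hypothesis("Container x will contain waste form y for z years", children=[
    Hypothesis("Corrosion rate is below the design limit", 0.7, 0.1),
    Hypothesis("Weld integrity meets specification", 0.5, 0.2, weight=2.0),
])
root.propagate()
print(f"for={root.confidence_for:.2f} against={root.confidence_against:.2f} "
      f"uncertainty={root.uncertainty:.2f}")
```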
Introduction: Resuscitation is a dynamic, complex and time-sensitive field which encompasses the management both of critically ill patients and of large multidisciplinary teams. Expertise in this area has not been adequately defined and, to date, no research has directly examined the decision-making and cognitive processes involved. The evolving paradigm of competency-based medical education (CBME) makes better defining expertise in this field critically important to aid the development of both educational and assessment methods. The technique of cognitive task analysis (CTA) has been used in a variety of fields to explicate the cognitive underpinnings of experts. Experts, however, often have limited insight into, and incomplete recall of, their decision-making processes. We hypothesized that eye-tracking, which provides first-person video combined with an overlying gaze indicator, could be used to enhance CTA and better define the characteristics of experts in resuscitation. Methods: Over an 18-month period, a sample of 11 trauma resuscitations was obtained, each led by one of four pre-selected expert physicians outfitted with the Tobii Pro Eye-Tracking Glasses. After each resuscitation, the participant was debriefed using a cued-recall, think-aloud protocol while watching his or her corresponding eye-tracking video. A subsequent qualitative analysis of the resulting video and debrief transcript was performed using an ethnographic approach to establish emerging themes and behaviours of the expert physicians. Results: The expert participants demonstrated specific, common patterns in their cognitive processes. In particular, participants exhibited similar anticipatory and visual behaviours, dynamic communication strategies and the ability to distinguish between task-relevant and task-redundant information. All participants reported that this technique uncovered otherwise subconscious aspects of their cognition. Conclusion: The novel combination of eye-tracking technology to supplement the CTA of expert resuscitationists enriched our understanding of expertise in this field and yielded specific findings that can be applied to better develop and assess resuscitation skills.
Introduction: Crisis decision-making is an important responsibility of the resuscitation team leader but a difficult process to study. The purpose of this study was to evaluate visual and behavioural differences between team leaders with different objective performance scores using gaze-tracking technology. Methods: Twenty-eight emergency medicine residents at different stages of training completed four simulated resuscitation scenarios. Participants wore gaze-tracking glasses during each station. An outside expert blinded to participant training level assessed performances using a validated assessment tool for simulation scenarios. Several visual endpoints were measured, including time, frequency, order, and latency to observation of task-relevant and task-redundant items. Non-visual endpoints included behaviours such as summarizing, verbalizing concerns, and calling for definitive treatments, among others. Results: Preliminary findings suggest significant differences between high and low performers. High performers check vital signs faster, and look at patients and vital signs more often, than low performers. Low-performing leaders display a more fixed gaze when starting a scenario. Lastly, high performers summarize, verbalize concerns, predict and prepare for future steps, and call for definitive treatment more often than low performers. Conclusion: There are significant differences between high- and low-performing resuscitation team leaders in terms of their visual and behavioural patterns. These differences identify potential focus points for competency evaluations and may direct educational interventions that could facilitate more efficient development of expertise. The potential to study crisis decision-making behaviours and performances using the methods and metrics identified, in both simulated and real-world settings, is substantial.
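To illustrate how visual endpoints of this kind can be derived, the sketch below computes latency to first fixation, fixation counts and total dwell time per area of interest (AOI) from a fixation log. The column names, AOI labels and values are hypothetical placeholders, not the study's data or the export format of any particular gaze-tracking device.

```python
import pandas as pd

# Hypothetical fixation log: one row per fixation, with onset time
# (seconds from scenario start), duration, and the AOI it landed on.
fixations = pd.DataFrame({
    "onset_s":    [1.2, 2.0, 3.5, 4.1, 6.0, 7.3],
    "duration_s": [0.3, 0.5, 0.2, 0.8, 0.4, 0.6],
    "aoi": ["monitor", "patient", "monitor", "airway", "patient", "monitor"],
})

# Latency to first observation of each AOI (lower = noticed sooner).
latency = fixations.groupby("aoi")["onset_s"].min()

# Frequency and total dwell time per AOI.
summary = fixations.groupby("aoi").agg(
    n_fixations=("onset_s", "size"),
    dwell_s=("duration_s", "sum"),
)
summary["first_look_s"] = latency
print(summary)
```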
The objective of this research was to explore how spirituality is currently understood and taught in New Zealand Medical Schools.
Methods:
A mixed methods study was carried out involving interviews (n = 14) and a survey (n = 73). The first stage of the study involved recorded semi-structured interviews of people involved in curriculum development at the Dunedin School of Medicine (n = 14), which then informed a cross-sectional, self-reported electronic survey (n = 73).
Results:
The results indicate that spirituality is regarded by many involved in medical education in New Zealand as an important part of healthcare that may be taught in medical schools, but also that there is little consensus among this group as to what the topic is about.
Significance of results:
These findings provide a basis for further discussion about including spirituality in medical curricula, and in particular indicate a need to develop a shared understanding of what ‘spirituality’ means and how it can be taught appropriately. As New Zealand is a highly secular country, these findings are significant for medical education in other secular Western countries. Addressing spirituality with patients has been shown to positively impact a range of health outcomes, but how spirituality is taught in medical schools is still developing across the globe.
A community outbreak of legionellosis occurred in Barrow-in-Furness, Cumbria, during July and August 2002. A descriptive study and active case-finding were instigated, and all known wet cooling systems and other potential sources were investigated. Genotypic and phenotypic analysis, including amplified fragment length polymorphism typing, of human clinical and environmental isolates confirmed the air-conditioning unit of a council-owned arts and leisure centre to be the source of infection. Subsequent sequence-based typing confirmed this link. One hundred and seventy-nine cases, including seven deaths [case fatality rate (CFR) 3·9%], were attributed to the outbreak. Timely recognition and management of the incident very likely led to the low CFR compared with other outbreaks. The outbreak highlights the responsibility associated with managing an aerosol-producing system that has the potential to expose and infect a large proportion of the local population, with consequent legal ramifications and human cost.
One hundred and six HIV-positive drug users were tested with a two-tone auditory evoked potential (AEP) task and a small battery of neuropsychological tests, to examine the relationship between the latency of the P300 component (P3) of the AEP, intellectual function, mood and drug use. Principal components analysis revealed a significant correlation between P3 latency and the first principal component (r = −0·43, P < 0·001). Varimax rotation generated three orthogonal components which we interpreted as intellectual performance, memory, and mood. Intellectual performance and self-reported mood were individually correlated with P3 latency, but memory was not (r = −0·36, P < 0·001; r = 0·23, P < 0·05; and r = −0·18, NS, respectively).
Subjects with symptomatic HIV disease had a higher correlation between P3 latency and intellectual performance than subjects with asymptomatic HIV disease and, among patients with symptomatic HIV disease, poorer memory was associated with a lower CD4 count. Opiate or benzodiazepine consumption did not correlate with poor intellectual performance, memory, or self-rated mood in our sample. These results indicate that there is a relationship between AEP latency and neuropsychological measures of intellectual function, and that it is influenced by subjective mood. Surprisingly, declared current drug use had no discriminable effect on these relationships.
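As an illustration of the kind of analysis described, the sketch below fits a three-factor solution with varimax rotation and correlates the rotated factor scores with P3 latency. It uses scikit-learn's factor analysis as a rough stand-in for the paper's principal-components-plus-rotation pipeline, and all data are random placeholders; the real variable set and values are not given in the abstract.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
# Hypothetical stand-in for the test battery: 106 subjects x 6 standardised
# neuropsychological scores.
scores = rng.standard_normal((106, 6))

# Three-factor solution with varimax rotation, mirroring the reported
# intellectual-performance / memory / mood components.
fa = FactorAnalysis(n_components=3, rotation="varimax").fit(scores)
loadings = fa.components_.T            # variables x factors
factor_scores = fa.transform(scores)   # subjects x factors

# Correlate each rotated factor with P3 latency (random here, for illustration).
p3_latency = rng.standard_normal(106)
for k in range(3):
    r = np.corrcoef(factor_scores[:, k], p3_latency)[0, 1]
    print(f"factor {k + 1}: r = {r:+.2f}")
```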
Terminal complement component deficiency predisposes to meningococcal infection and is inherited in an autosomal co-dominant manner. An Irish family is described in which 2 of 3 brothers had recurrent meningococcal infection. A novel screening assay was used to test for terminal complement deficiency, and the 2 affected brothers were found to be completely deficient in the seventh component of complement (C7). Enzyme-linked immunosorbent assay for C7 revealed lower than normal levels in the remaining brother and the parents. C7 M/N protein polymorphism allotyping, used to investigate the segregation of the C7 deficiency genes, showed that the apparently complement-sufficient brother was heterozygous for C7 deficiency and a carrier of one of the deficiency genes. Complement screening should be carried out in any individual suffering recurrent meningococcal infection or infection with an uncommon meningococcal serogroup. Identification of complement-deficient patients allows the implementation of strategies to prevent recurrent infection.
Amongst a collection of temperature-sensitive (TS) mutants of Escherichia coli K-12, some have been found which can grow at the restrictive temperature (42 °C) if the osmotic pressure of the medium is raised by the addition of sodium chloride (1%) or sucrose (12·5%). These mutants are described as temperature-sensitive osmotic-remedial (TSOR) mutants. At the restrictive temperature they are not osmotically fragile, but do display decreased resistance to inhibitory agents such as deoxycholate, actinomycin D and acridine orange; they also show release of the periplasmic enzyme ribonuclease. These results indicate a change in the cell's outer permeability barrier. The genes affected in six of the mutants have been located on the E. coli linkage map. The mutations, which occur at loci not previously described, have been named envM–envT to indicate their effect on the cell envelope.
Weaned lambs of mean weight 25 kg were offered a diet of mature oaten hay or hay supplemented with a pelleted mixture of oat grain and sunflower meal (2:1), at one of three rates, for 86 days. The effect of the supplement on the voluntary intake of hay was measured during the first 20 days when feed was offered twice daily (Expt 1), after which the effects on ruminal and post-ruminal digestion were investigated under continuous feeding conditions (Expt 2).
In Expt 1 the first increment of supplement increased the total intake of organic matter (OM) but increasing the supplement further, up to 510 g D.M., had no additional effect. The voluntary intake of oaten hay was not significantly reduced by the lowest rate of supplementation but at higher rates was depressed at a mean rate of 92 g/100 g supplement. Rates of change in fasted weight on the four treatments were −63, −5, 21 and 45 g/day, respectively.
In Expt 2, where the rates of hay intake were held at 85% of those achieved in Expt 1, the first increment of supplement increased the pool size of OM and cell wall components in the reticulo-rumen by about 50%. It also increased their outflow rates at the abomasum by 24% and 33%, respectively, but significantly decreased the fractional outflow rate and fractional digestion rate of cell wall components. Supplementation decreased the proportion of apparent OM digestion that occurred in the reticulo-rumen from 76% to 65%. The presence of supplement doubled the ammonia pool in the rumen and increased the abomasal flow of non-ammonia nitrogen (NAN) and of microbial NAN by 70%. Estimates of the amounts of crude protein apparently digested in the intestines (DCPi) increased linearly with the proportion of supplement in the diet. However, the apparent digestibility of the hay was decreased, rather than increased, by the supplement. Although higher rates of supplement did not significantly change the elevated rumen pools of OM and cell wall components, there was a consistent tendency for these to decrease.
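The drop in fractional outflow rate alongside an increased absolute outflow follows directly from the pool growing proportionally faster than the flow; a back-of-envelope sketch with hypothetical baseline values makes the arithmetic explicit (only the ~50% and ~24% changes come from the text).

```python
# Fractional outflow rate k = outflow / pool. If the pool rises ~50% while
# outflow rises only ~24%, k must fall even though more material leaves.
pool_control, outflow_control = 1000.0, 240.0    # g OM, g OM/day (hypothetical)
pool_suppl    = pool_control * 1.50              # pool up ~50% with supplement
outflow_suppl = outflow_control * 1.24           # outflow up ~24%

k_control = outflow_control / pool_control       # /day
k_suppl   = outflow_suppl / pool_suppl
print(f"control k = {k_control:.3f}/day, supplemented k = {k_suppl:.3f}/day")
# 1.24 / 1.50 ~= 0.83, i.e. the fractional rate drops about 17%.
```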
The results are consistent with the view that the intake of unsupplemented hay was limited by its low nitrogen content and the intake of supplemented hay may have been limited primarily by the capacity of the reticulo-rumen, although other factors were increasingly involved at higher rates of supplementation.
The utilization by sheep of dietary N provided in high-protein, high-water-content fresh herbages (Ruanui perennial ryegrass, Tama Westerwolds ryegrass, Pitau white clover, and Fakir giant sainfoin at two stages of maturity) was studied at two levels of intake (maintenance and 1·5 × maintenance). Feed was provided by a constant feeder.
Apparent digestibility of N was similar for all herbages (ca. 85%) except sainfoin which, particularly at a late stage of maturity, was lower (70–80%). A small loss of nitrogen across the stomachs occurred with clover (1–3 g/day) and Tama ryegrass at the higher feeding level (2 g/day), but no loss was observed with the other diets. The apparent digestibility of N and of non-ammonia N (NAN) in the intestines was lower for sainfoin, and estimated true digestibility was also lower. Amino acid N contributed less to the NAN reaching the duodenum on the sainfoin diets than on the grass and clover diets.
N retention was negative at the lower level of feeding for ryegrass and clover diets. It was greatest for the sainfoin diets at similar N intakes, so that efficiency of retention of apparently digested N was also greatest for sainfoin.
The size of the urea pool, the plasma urea concentration and the urea irreversible loss, using [14C]urea, did not differ significantly between diets at similar N intake. Urea irreversible loss exceeded urinary urea excretion by 35–50% on all but the late-maturity sainfoin diet, where urea irreversible loss was more than double the urinary urea output. These data indicate dietary differences in the extent of degradation of urea on recycling to the gastro-intestinal tract. Urea clearance across the kidney was also lowest for sainfoin.
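The gap between irreversible loss and urinary excretion quantifies urea degraded after recycling to the gut rather than excreted; the sketch below works through that arithmetic with a hypothetical urinary output (only the 35–50% excess comes from the text).

```python
# Urea irreversible-loss rate (ILR) measured with [14C]urea exceeds urinary
# urea excretion because part of the synthesised urea is degraded after
# recycling to the gastro-intestinal tract. Illustrative numbers only.
urinary_urea = 10.0                      # g N/day, hypothetical
for excess in (0.35, 0.50):              # ILR exceeded urinary output by 35-50%
    ilr = urinary_urea * (1 + excess)
    gut_degraded = ilr - urinary_urea    # urea degraded in the gut
    frac_recycled = gut_degraded / ilr
    print(f"ILR = {ilr:.1f} g N/day; fraction degraded in gut = {frac_recycled:.0%}")
```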
Data are compared in a simple model that illustrates the importance of variable clearance of urea across the kidney and the gut wall, and the need for knowledge of the factors that control this.