The modern idea of purebred dogs has come under increasing critical scrutiny over recent decades. In light of this critical focus and other developments in society, some new trends in how companion dogs are bred and acquired have emerged. These include a diminishing influence of traditional kennel clubs, with more dogs being sold without a pedigree; stricter legal restrictions on dog breeding; the growing popularity of deliberate crosses of established breeds (i.e. so-called designer breeds); and growing hype around the benefits of mixed-breed dogs. We give an overview of these trends and discuss to what extent they will serve to promote dogs that are innately healthy, have good welfare and function well in their various roles in today’s world. We argue that newly invented designer breeds and mixed breeds also have worrying health and behavioural problems, and that the predictability of purebred dogs with respect to body size, basic behaviours, known need for grooming, disorder profiles and other attributes may well offer some benefits for a satisfying human-dog relationship, seen from both sides. The optimal future seems to lie in the middle ground, where the future organised dog world (i.e. kennel and breed clubs or their successor organisations) will need to re-open the breed registries, remove wording from breed standards that currently promotes extreme conformation, support selection against disease-predisposing genotypes and phenotypes, and refocus dog showing and breeding to promote health and appropriate behaviour.
Rapid tranquillisation – the parenteral administration of a sedating psychotropic – is frequently utilised to manage acute behavioural disturbances. Each mental health trust in England uses its own guidelines for rapid tranquillisation, which vary geographically both in the therapeutic agents recommended and in the format in which this information is presented. Audits have identified that there is currently poor adherence to rapid tranquillisation protocol guidelines; this may be due to a lack of guideline clarity allowing for personal interpretation. This service evaluation aims to determine the clarity and uniformity of protocols outlined in mental health trust guidelines, in addition to analysing the outcomes of guideline testing to identify whether there is consistency between policies, or whether outcomes vary depending on the trust guidelines used.
Methods
Five reviewers (at differing stages of clinical training) reviewed 52 guidelines: one from each mental health trust in England, together with the Maudsley and NICE guidelines. Each was assessed using the same fictional scenario, which simulated a common presentation requiring rapid tranquillisation. Reviewers deduced the most appropriate therapeutic agent according to the guideline, rated the clarity of each guideline and were invited to leave comments highlighting the guideline's usability.
Results
Across the 52 guidelines, the management plans selected by the majority of reviewers comprised seven different plans in total. Lorazepam was the most frequently selected therapeutic agent. Guidelines with better subjective ratings of clarity had greater agreement between reviewers, but full agreement between reviewers was present for only 10 of the 52 guidelines. For 11 guidelines, consensus between reviewers was not reached. Qualitative analysis of comments identified the inclusion of past medical history, drug history and flow charts as positive sub-themes. Redundant language, contradictions and the suggestion to seek senior intervention before trialling a second agent were viewed negatively. Many guidelines did not sufficiently emphasise the need to perform an ECG before administering agents such as haloperidol, which may otherwise lead to potentially fatal arrhythmias.
Conclusion
There is no national consensus on the most appropriate rapid tranquillisation agents, with the available evidence being interpreted differently by different trusts and organisations. Poor guideline comprehensibility impacts clinician adherence and allows personal preference to influence the choice of drug. Clear guidelines using flow charts to succinctly outline relevant doses and absolute contraindications were viewed favourably by reviewers. The findings of this project highlight to relevant stakeholders the attributes that should be incorporated when improving guidelines in the future.
Mineralogical (XRD), morphological (transmission electron microscopy), chemical (major, rare-earth elements, and scanning-transmission electron microscopy), and isotope (Sr, O, H) measurements were made of marine detrital smectite from shales to study their reactions during early diagenesis. Albian, Aptian, and Palaeogene smectite samples were selected from Deep Sea Drilling Project drill cores taken in the Atlantic Ocean and from outcrops and drill cores from Belgium and northern France. Detrital, flake-like smectite particles seem to have adapted to their depositional environment by isochemical dissolution and subsequent crystallization of authigenic, lath-like particles. The major-element and rare-earth element compositions of both types of particles are similar. The Sr isotope chemistry suggests that the dissolution-crystallization process occurred soon after deposition in an almost closed chemical system. Except for slight changes in the amount of Fe and the oxygen isotope composition, the reaction took place without noticeable chemical exchange with the interstitial or marine environment. Such closed-system recrystallization of clay minerals may be a common diagenetic process if the water/rock ratio is small, as in shales.
Sediments from depths to 670 m in the Barbados accretionary complex and transecting the décollement zone have been studied by transmission and analytical electron microscopy (TEM/AEM). The sediments consist of claystone and mudstone intercalated with layers of volcanic ash. Smectite comprises the bulk of the noncalcareous sediments and forms a continuous matrix enveloping sparse, irregular, large grains of illite, chlorite, kaolinite and mixed-layer illite/chlorite of detrital origin at all depths. The detrital origin of illite is implied by illite-smectite textural relations, well-ordered 2M polytypism, and a muscovite-like composition. K is the dominant interlayer cation in smectite at all depths, in contrast to the Na and Ca that are normally present in similar rocks.
Deeper samples associated with the décollement zone contain small (up to 100 Å thick) illite packets included within still-dominant subparallel layers of contiguous smectite. AEM analyses of these packets imply illite-like compositions. Selected area electron diffraction (SAED) patterns show that this illite is the 1Md polytype. Packets display step-like terminations like those seen in illite of hydrothermal origin. The data collectively demonstrate that smectite transforms progressively to illite via a dissolution-recrystallization process within a depleting matrix of smectite, and not by a mechanism of layer replacement. This illite seems to form at depths as shallow as 500 m and temperatures of 20°-30°C, which is in marked contrast to the much higher temperature conditions normally assumed for this transformation. This implies that the high water/rock ratios associated with the décollement zone are significant in promoting reaction.
The compositions, fabrics and structures of authigenic minerals that formed recently from silicic volcanic ash layers from a 1300-meter sediment column obtained at ODP Site 808 of the Nankai Trough were studied using XRD, STEM, AEM and SEM. Smectite and zeolites were first detected as alteration products of volcanic glass with increasing depth, as follows: smectite at 200 m below seafloor (mbsf) (20 °C), clinoptilolite at 640 mbsf (60 °C) and analcime at 810 mbsf (75 °C).
A primitive clay precursor to smectite was observed as a direct alteration product of glass at 366 mbsf (approximately 30 °C). High-defect smectite with lattice-fringe spacings of 12 to 17 Å and a cellular texture filling pore space between altering glass shards occurs at 630 mbsf. Packets of smectite become larger and less disordered with increasing depth and temperature. The smectite that forms as a direct alteration product of volcanic glass has K as the dominant interlayer cation.
With increasing depth, smectite becomes depleted in K as the proportion of clinoptilolite increases, and then becomes depleted in Na as the proportion of analcime increases. The composition of the exchangeable interlayer of smectite appears to be controlled by the formation first of K-rich clinoptilolite and then Na-rich analcime, via the pore fluid, giving rise at depth to Ca-rich smectite. Smectite reacted to form illite in the interbedded shales but not in the bentonites. Paucity of K in smectite and pore fluids, due to the formation of clinoptilolite under closed-system conditions, is believed to have inhibited the reaction in the bentonites relative to the shales.
Empowering the Participant Voice (EPV) is an NCATS-funded six-CTSA collaboration to develop, demonstrate, and disseminate a low-cost infrastructure for collecting timely feedback from research participants, fostering trust, and providing data for improving clinical translational research. EPV leverages the validated Research Participant Perception Survey (RPPS) and the popular REDCap electronic data-capture platform. This report describes the development of infrastructure designed to overcome identified institutional barriers to routinely collecting participant feedback using RPPS and demonstration use cases. Sites engaged local stakeholders iteratively, incorporating feedback about anticipated value and potential concerns into project design. The team defined common standards and operations, developed software, and produced a detailed planning and implementation Guide. By May 2023, 2,575 participants diverse in age, race, ethnicity, and sex had responded to approximately 13,850 survey invitations (18.6%); 29% of responses included free-text comments. EPV infrastructure enabled sites to routinely access local and multi-site research participant experience data on an interactive analytics dashboard. The EPV learning collaborative continues to test initiatives to improve survey reach and optimize infrastructure and process. Broad uptake of EPV will expand the evidence base, enable hypothesis generation, and drive research-on-research locally and nationally to enhance the clinical research enterprise.
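Because the infrastructure is built on REDCap, sites can typically also retrieve RPPS responses programmatically through the standard REDCap API for local analysis. The sketch below is illustrative only and is not part of the EPV software itself; the endpoint, token and instrument name ("rpps_survey") are placeholders.

# Illustrative sketch: exporting survey records from a REDCap project via the
# standard REDCap API. The endpoint, token and instrument name are
# placeholders, not details of the EPV infrastructure.
import requests

REDCAP_URL = "https://redcap.example.edu/api/"  # hypothetical endpoint
API_TOKEN = "YOUR_PROJECT_TOKEN"                # project-specific token

payload = {
    "token": API_TOKEN,
    "content": "record",        # export records
    "format": "json",           # return JSON
    "type": "flat",             # one row per record
    "forms[0]": "rpps_survey",  # hypothetical RPPS instrument name
}

response = requests.post(REDCAP_URL, data=payload, timeout=30)
response.raise_for_status()
records = response.json()
print(f"Exported {len(records)} survey records")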
The management of unresponsive post-cardiac arrest patients has been an area of much controversy and research. Interventions during the arrest, particularly no-pause cardiopulmonary resuscitation and the advent of the automated external defibrillator have placed patients in a better position to respond to post-resuscitative measures. The initiation of post-arrest therapeutic hypothermia protocols, now referred to as targeted temperature management (TTM), has resulted in significant improvement in the number of neurologically intact survivors. Post-arrest care is centered on TTM, but includes a series of protocolized steps that define the measures to be taken to optimize outcome in these patients. Post-arrest care is the fifth and final link in the American Heart Association’s out-of-hospital chain of survival.
The Psychiatric Resident On-Call (PROC) rota provides medical cover for all inpatients across Leeds and York Partnership NHS Foundation Trust outside normal working hours. With the introduction of a new regional inpatient CAMHS unit in August 2021, a service evaluation project was undertaken to establish if the current medical provision was sufficient to meet the increased demand of expanding services.
Methods
Workload monitoring was undertaken for 28 days during August and September 2022, covering all evening, weekend, night and bank holiday shifts. Data collection documents, in the form of Microsoft Excel spreadsheets, were sent to one doctor on each shift to co-ordinate completion for all PROC doctors. For each 30-minute period, the number of doctors engaged in clinical activity was documented, and the mean number working at that time, together with the standard deviation, was calculated.
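As a rough illustration of this aggregation (not the spreadsheet actually used), the per-period mean and standard deviation could be computed along the following lines; the file and column names are assumptions.

# Minimal sketch: mean and standard deviation of the number of PROC doctors
# engaged in clinical activity for each 30-minute period, across shifts.
# The file and column names ("period_start", "doctors_engaged") are assumptions.
import pandas as pd

df = pd.read_excel("proc_workload_returns.xlsx")  # hypothetical combined returns

summary = (
    df.groupby("period_start")["doctors_engaged"]
      .agg(mean_engaged="mean", sd_engaged="std")
      .reset_index()
)
print(summary.head())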
Results
51 of the 56 on-call shifts during the workload monitoring period were accounted for by the return of a completed data collection document. Workforce demand for the remaining five shifts was estimated by reviewing the handover documents, which listed the time of call-out and expected duration for each job.
Data showed that workload was consistent throughout weekend shifts but slowed around the time of handover. This is likely because the end of a shift is used to complete documentation and the start of a shift to assign roles and plan the shift ahead. In addition, lengthy non-urgent tasks may not have been appropriate to undertake if a PROC was due to end their shift shortly.
Patterns of night-shift working suggested a steady demand during the early part of the shift, with a reduction in the early hours of the morning; trough levels were observed between 04:00 and 05:00. No significant differences were observed between evening and night shifts across weekdays or weekends.
Conclusion
Assessing the above data led the authors to propose two changes to workforce provision which may increase the efficiency with which workload is managed. The first was to implement a cross-over role that could bridge periods of handover and ensure that a medic is still available to respond to tasks despite the change in workforce around these times. The second was to rebalance allocated provision so that fewer medics are on shift during the early hours of the morning, when demand was lowest, and are re-allocated to evenings or weekends, where demand appeared to be greater.
Service users taking long-acting injectable antipsychotics (LIAs) may experience recurrence of symptoms as they approach trough levels within a steady-state cycle. Limited research exists on symptom variation between peak and trough plasma concentrations across LIA inter-dose intervals. Different LIAs show variable rates of change in dopamine receptor occupancy during this peak-to-trough variation because of their differing elimination half-lives. It is unclear what rate of change in D2 blockade is tolerated by patients at present, which this trial aims to determine by observing differences in symptom severity during peak-to-trough variation.
Methods
A real-world observational longitudinal cohort study is proposed. Inclusion criteria would be working-age adults (18–65 years) who have received five consecutive and timely LIA administrations of a consistent drug and dose. The study would exclude anyone with significant hepatic or renal impairment, anyone on concurrent oral antipsychotic medication and anyone deemed not yet to have reached steady-state plasma levels of their LIA medication.
Serum assays for drug level will be obtained at both peak and trough concentrations during an LIA cycle. Expected timings for peak levels will be determined by derived tmax values from existing pharmacokinetic literature for individual drugs. Trough levels will be taken within 24 hours of the next LIA administration being due. Plasma drug concentrations will then be used to calculate expected striatal D2 blockade using EC50 values and maximal occupancy for individual drugs derived from existing PET scan data.
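As a purely illustrative sketch of this calculation (not part of the protocol itself), expected occupancy can be computed from a measured plasma concentration with a standard Emax-type model; the numerical EC50 and maximal-occupancy values below are placeholders rather than drug-specific figures from the PET literature.

# Sketch of the planned occupancy calculation: expected striatal D2 receptor
# occupancy from a measured plasma concentration using an Emax-type model.
# EC50 and maximal-occupancy values are placeholders, not drug-specific
# figures from the PET literature.

def d2_occupancy(plasma_conc, ec50, max_occupancy=1.0):
    """Fractional D2 occupancy = Occ_max * C / (EC50 + C)."""
    return max_occupancy * plasma_conc / (ec50 + plasma_conc)

# Hypothetical peak and trough concentrations (same units as EC50)
peak = d2_occupancy(plasma_conc=40.0, ec50=10.0, max_occupancy=0.9)
trough = d2_occupancy(plasma_conc=15.0, ec50=10.0, max_occupancy=0.9)
print(f"Peak {peak:.0%}, trough {trough:.0%}, difference {peak - trough:.0%}")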
Symptom severity will be assessed by completing the Positive and Negative Syndrome Scale (PANSS) with service users at the time of both peak and trough plasma concentrations of LIA. The difference in these scores will then be plotted alongside the difference in expected D2 blockade derived from plasma drug concentrations.
Results
We hypothesize that the rate of change in D2 occupancy would correlate with differences in symptom severity in an exponential manner, in that drugs with shorter elimination half-lives would show a greater difference in symptom severity between peak and trough. We expect that service users would be able to tolerate such change to a degree without significant emergence of symptoms; the trial aims to determine the threshold that most service users can tolerate, which may then assist in guiding how to reduce and discontinue medications effectively.
Conclusion
This outlines a research protocol to monitor response to pharmacokinetic variation within inter-dose intervals of LIA medication, which may ultimately aid service users in reducing and discontinuing antipsychotics.
This study aimed to review whether clinicians varied significantly in their choice of rapid tranquillisation agents when using consistent clinical guidelines, and to analyse the rationale behind their decision-making. It also aimed to assess confidence across varying grades and levels of clinical experience, and to evaluate the efficacy of current trust guidelines. We hypothesized that less experienced clinicians would be less willing to prescribe antipsychotics for rapid tranquillisation, and that current guidelines would not allow for consistent and uniform prescribing.
Methods
A qualitative survey was distributed to 165 clinicians within one mental health trust, including core psychiatry trainees, trust-grade doctors, higher trainees, staff-grade doctors and working-age adult consultants. The survey included a fictional but commonly occurring scenario to which clinicians responded with the aid of current trust guidance. Respondents were then asked to justify their choice, to rank their confidence in prescribing rapid tranquillisation and to rate how useful the guideline was in aiding their decision. Thirty-six participants responded, a response rate of around 22%, with even representation across clinical grades.
Results
Clinicians of all grades were equally willing to prescribe antipsychotic agents for rapid tranquillisation. Higher psychiatric trainees reported the greatest self-confidence when prescribing rapid tranquillisation, with consultants surprisingly lower in confidence. Intramuscular olanzapine was the most favoured option, but significant variability was observed in the management suggested by different clinicians. Main themes for suggested amendments to the guideline included clarity; guidance on when to use the various options; further specification of dosage ranges; and options for specific instances, such as when a patient is antipsychotic naïve or there is minimal physical health information.
There was marked variability in the choice of agent. The majority of clinicians felt that early commencement of an antipsychotic was beneficial in acutely unwell patients, although the merits of an initial medication-free assessment were also raised. Key themes in the choice of tranquillisation included the need for a prior electrocardiogram before prescribing intramuscular haloperidol, the potential lack of efficacy of aripiprazole, the risk of respiratory depression with concurrent olanzapine and lorazepam, and a surprisingly high proportion of respondents opting for combined use of haloperidol plus a further sedative.
Conclusion
Less experienced clinicians were not found to lack confidence in prescribing antipsychotics for rapid tranquillisation. However, when clinicians responded to the same clinical scenario using the same guideline, there was marked variability in the choice of rapid tranquillisation agent. This highlights a need for clearer guidelines and education on this matter to ensure a consistent approach to tranquillising medication.
The Nine Years War, also known as Tyrone's Rebellion, raged across Ireland for ten years, from 1593 to 1603, as a confederation of Irish lords led by Hugh O'Neill, second earl of Tyrone, almost succeeded in extinguishing English power in Ireland. It retains a popular image as a guerrilla war by Irish lords to throw off English rule that was ultimately doomed to failure, since primitive Ireland could never hope to match the economic and military strength of Gloriana's England. The conflict has often been portrayed as a no-holds-barred struggle in which brutality was the norm. The orgy of bloodshed and cataclysmic famine in Ulster which brought the war to its close helped cement this image. The war in Ireland gained a reputation in Europe as one of uncommon savagery, with subsequent English publications attributing the bulk of that aggression and cruelty to the Irish. This appears very neat and uncontentious. However, the narrative is riven with fabrications which began during the war and have proliferated in modern historiography.
Many assumptions about the nature and course of the war are not borne out by the evidence, yet they continue to permeate the historiography of the conflict. Two of the most pervasive strands of this myth are of particular concern: the primitivism of the Irish, and the savagery of the English campaign to suppress the Gaelic lords. In truth, the Irish were anything but primitive, and in many ways were more advanced and adaptive than their English adversaries. Nevertheless, both English and Irish writers sought to portray the Irish as crude and unsophisticated. The modern critical vogue for equating the closing stages of the war with genocide is equally founded upon questionable sources. A reassessment of the contemporary accounts suggests the horrors in Ulster were overstated at the time, and later interpretations have emphasised aspects of the devastation caused by war to apportion blame to one side of the conflict, in this case the English. Whilst English officers and soldiers did not view themselves as bloodthirsty or acting contrary to accepted norms, the popular and academic interpretation of the conflict has often been one of untrammelled cruelty by crown forces.
Adjuvant effects on disease severity caused by the bioherbicide P. papaveracea on opium poppy were evaluated. Tween 20, Tween 80, Triton X-100, Tactic, CelGard, and Keltrol inhibited appressorium formation but not conidial germination on detached leaves. Disease severity varied from 11 to 83% necrosis in field experiments involving eight adjuvants at various concentrations, applied with 1 × 10⁶ conidia ml⁻¹ or without the pathogen. The three best-performing adjuvants when combined with the pathogen, Tactic (1%, v/v), Bond (1%, v/v), and Tween 20 (1%, v/v), were included along with Tween 20 (0.001%, v/v) in field experiments in 1998. Tween 20 (1%, v/v) plus pathogen (1 × 10⁶ conidia ml⁻¹) caused the most severe disease, averaging 68% necrosis within 2 wk of treatment. Overall, plots treated with adjuvant plus P. papaveracea had a 22% reduction in capsule weight per plot as compared to plots treated with the adjuvant alone. Tactic (1%, v/v), Silwet L-77 (0.1%, v/v), Tween 20 (1%, v/v), and Tween 20 (0.001%, v/v) were included in field experiments in 1999. The treatment with Tween 20 (1%, v/v) plus pathogen (2 × 10⁶ conidia ml⁻¹) caused severe disease, averaging 56% necrosis within 2 wk of treatment. In 1999, plots treated with adjuvant plus pathogen averaged a 27% reduction in capsule weight as compared with plots treated with the adjuvants alone. The inclusion of Tween 20 (1%, v/v) with P. papaveracea conidia greatly enhanced efficacy on opium poppy.
The amphibian pathogen Batrachochytrium dendrobatidis (Bd) has recently emerged as a primary factor behind declining global amphibian populations. Much about the basic biology of the pathogen is unknown, however, such as its true ecological niche and life cycle. Here we evaluated invertebrates as infection models by inoculating host species that had previously been suggested to be parasitized in laboratory settings: crayfish (Procambarus alleni) and nematodes (Caenorhabditis elegans). We found neither negative effects on either host nor evidence of persistent infection despite using higher inoculum loads and more pathogen genotypes than tested in previous studies. In contrast, addition of Bd to C. elegans cultures had a slight positive effect on host growth. Bd DNA was detected on the carapace of 2/34 crayfish 7 weeks post-inoculation, suggesting some means of persistence in the mesocosm. These results question the role of invertebrates as alternative hosts of Bd and their ability to modulate disease dynamics.
Healthcare provider hands are an important source of intraoperative bacterial transmission events associated with postoperative infection development.
OBJECTIVE
To explore the efficacy of a novel hand hygiene improvement system leveraging provider proximity and individual and group performance feedback in reducing 30-day postoperative healthcare-associated infections via increased provider hourly hand decontamination events.
DESIGN
Randomized, prospective study.
SETTING
Dartmouth-Hitchcock Medical Center in New Hampshire and UMass Memorial Medical Center in Massachusetts.
PATIENTS
Patients undergoing surgery.
METHODS
Operating room environments were randomly assigned to usual intraoperative hand hygiene or to a personalized, body-worn hand hygiene system. Anesthesia and circulating nurse provider hourly hand decontamination events were continuously monitored and reported. All patients were followed prospectively for the development of 30-day postoperative healthcare-associated infections.
RESULTS
A total of 3,256 operating room environments and patients (1,620 control and 1,636 treatment) were enrolled. The mean (SD) provider hand decontamination event rate achieved was 4.3 (2.9) events per hour, an approximate 8-fold increase in hand decontamination events above that of conventional wall-mounted devices (0.57 events/hour); P<.001. Use of the hand hygiene system was not associated with a reduction in healthcare-associated infections (odds ratio, 1.07 [95% CI, 0.82–1.40], P=.626).
CONCLUSIONS
The hand hygiene system evaluated in this study increased the frequency of hand decontamination events without reducing 30-day postoperative healthcare-associated infections. Future work is indicated to optimize the efficacy of this hand hygiene improvement strategy.
We report results of an experimental investigation into the effects of small-scale (mm–cm) heterogeneities on solute spreading and mixing in a Berea sandstone core. Pulse-tracer tests have been carried out in the Péclet number regime $Pe=6{-}40$ and are supplemented by a unique combination of two imaging techniques. X-ray computed tomography (CT) is used to quantify subcore-scale heterogeneities in terms of permeability contrasts at a spatial resolution of approximately $10~\text{mm}^{3}$, while $[^{11}\text{C}]$ positron emission tomography (PET) is applied to image the spatial and temporal evolution of the full tracer plume non-invasively. To account for both advective spreading and local (Fickian) mixing as driving mechanisms for solute transport, a streamtube model is applied that is based on the one-dimensional advection–dispersion equation. We refer to our modelling approach as semideterministic, because the spatial arrangement of the streamtubes and the corresponding solute travel times are known from the rock’s measured permeability map, and required only small adjustments to match the measured tracer breakthrough curve. The model reproduces the three-dimensional PET measurements accurately by capturing the larger-scale tracer plume deformation as well as subcore-scale mixing, while confirming negligible transverse dispersion over the scale of the experiment. We suggest that the obtained longitudinal dispersivity ($0.10\pm 0.02$ cm) is rock rather than sample specific, because of the ability of the model to decouple subcore-scale permeability heterogeneity effects from those of local dispersion. As such, the approach presented here proves to be very valuable, if not necessary, in the context of reservoir core analyses, because rock samples can rarely be regarded as ‘uniformly heterogeneous’.
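For reference, the governing equation within each streamtube is the one-dimensional advection–dispersion equation, written here in a generic, simplified form (with $v$ the streamtube velocity, $D_{L}$ the longitudinal dispersion coefficient and $\alpha_{L}$ the longitudinal dispersivity quoted above; molecular diffusion is neglected in this statement):

$$\frac{\partial C}{\partial t} + v\,\frac{\partial C}{\partial x} = D_{L}\,\frac{\partial^{2} C}{\partial x^{2}}, \qquad D_{L} = \alpha_{L}\,v .$$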
Convincing evidence has identified inflammation as an initiator of atherosclerosis, underpinning CVD. We investigated (i) whether dietary inflammation, as measured by the ‘dietary inflammatory index (DII)’, was predictive of 5-year CVD in men and (ii) its predictive ability compared with that of SFA intake alone. The sample consisted of 1363 men enrolled in the Geelong Osteoporosis Study who completed an FFQ at baseline (2001–2006) (excluding participants who were identified as having previous CVD). DII scores were computed from participants’ reported intakes of carbohydrate, micronutrients and glycaemic load. DII scores were dichotomised into a pro-inflammatory diet (positive values) or an anti-inflammatory diet (negative values). The primary outcome was a formal diagnosis of CVD resulting in hospitalisation over the 5-year study period. In total, seventy-six events were observed during the 5-year follow-up period. Men with a pro-inflammatory diet at baseline were twice as likely to experience a CVD event over the study period (OR 2·07; 95 % CI 1·20, 3·55). This association held following adjustment for traditional CVD risk factors and total energy intake (adjusted OR 2·00; 95 % CI 1·03, 3·96). This effect appeared to be stronger with the inclusion of an age-by-DII score interaction. In contrast, SFA intake alone did not predict 5-year CVD events after adjustment for covariates (adjusted OR 1·40; 95 % CI 0·73, 2·70). We conclude that an association exists between a pro-inflammatory diet and CVD in Australian men. CVD clinical guidelines and public health recommendations may have to expand to include dietary patterns in the context of vascular inflammation.
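The adjusted odds ratios reported above are of the kind produced by a logistic regression of 5-year CVD events on the dichotomised DII score with covariate adjustment. The following is a minimal, purely illustrative sketch; the file and variable names are assumptions, and the authors' exact covariate set is not reproduced here.

# Illustrative sketch only: logistic regression of 5-year CVD events on a
# dichotomised DII score with covariate adjustment. File and variable names
# are assumptions; the authors' exact covariate set is not reproduced.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("geelong_men.csv")  # hypothetical analysis dataset

# Pro-inflammatory diet indicator: positive DII score
df["pro_inflammatory"] = (df["dii_score"] > 0).astype(int)

model = smf.logit(
    "cvd_event ~ pro_inflammatory + age + smoking + total_energy",
    data=df,
).fit()

print(np.exp(model.params))      # adjusted odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals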