The introduction of ‘no-fault divorce’ in Australia in 1976 promised unhappy spouses a ‘dignified’ way to end their marriage without the need to assign responsibility for the relationship's demise. But in 1989, Richard Ingleby's study of matrimonial breakdown hinted that the reformers of the 1970s had failed to appreciate the non-mutuality of the uncoupling process, and that the Family Law Act 1975 (Cth) (‘Family Law Act’) had not been ‘able to prevent divorcing parties feeling the need to consider issues of fault’. Since that time, a growing body of research evidence has revealed that blame and recrimination remain salient issues for divorcing couples, and academic commentators have suggested that battles over children and property are often proxies for unresolved relationship grievances. In this climate, legal scholarship has witnessed a renewed interest in the issue of spousal conduct, with radical proposals to construct ‘disloyalty’ in marriage as a legally relevant matter in divorce settlements, and consumers of the family law system have called for the law to pay greater heed to the moral dimensions of intimate relationships.
The 2022 Commonwealth Games (B2022) were hosted by Birmingham, United Kingdom (UK) from July 28 to August 8, 2022. As a major global sporting event and mass gathering, B2022 included over 4,500 athletes (from 72 countries and territories) and attracted 1.5 million spectators. Robust public health surveillance and support for health protection incidents were required from the UK Health Security Agency (UKHSA) to protect the health of both those directly involved in B2022 and the local population.
Method:
UKHSA surveillance activities in the UK West Midlands region were enhanced, utilizing lessons learned from the response to the London 2012 Olympic and Paralympic Games and the 2021 G7 Summit (hosted in England). Enhancements included the adaptation of existing methods and the development of new methods for identifying increased activity of a range of pathogens, diseases and conditions of particular concern to a mass gathering, and standardized daily situation reporting to inform both public health action and the B2022 organizing committee. Three streams of routine UKHSA surveillance data were assessed each day: a UKHSA health protection/clinical management system, statutory laboratory reports of infection, and syndromic surveillance. Bespoke surveillance was also implemented using B2022 health data sources.
Results:
Enhanced daily surveillance activities successfully met the need for next-day public health surveillance and reporting during B2022. No outbreaks or incidents of public health significance to the Games were identified. Syndromic surveillance reported an increased impact on local health services due to periods of extremely hot weather before and after the competition period, although these impacts were not unique to the Birmingham area.
Conclusion:
Surveillance and epidemiology reporting for B2022 provided reassurance there were no incidents/outbreaks of public health significance to the Games. The enhancements made will inform future routine surveillance and reporting activities and will be employed for similar activities during future mass gathering events.
To investigate whether a psychiatry-specific virtual on-call training programme improved the confidence of junior trainees in key areas of psychiatry practice. The programme, delivered via Microsoft Teams, comprised one 90-minute lecture and a 2-hour simulated on-call shift in which participants were bleeped to complete a series of common on-call tasks.
Results
Thirty-eight trainees attended the lecture, with a significant improvement in confidence in performing seclusion reviews (P = 0.001), prescribing psychiatric medications for acute presentations (P < 0.001), working in section 136 suites (places of safety) (P = 0.001) and feeling prepared for psychiatric on-call shifts (P = 0.002). Respondents reported that a virtual on-call practical session would be useful for their training (median score of 7, interquartile range 5–7.75). Eighteen participants completed the virtual on-call session, with significant improvement in 9 out of the 10 tested domains (P < 0.001).
Clinical implications
The programme can be conducted virtually, with low resource requirements. We believe it can improve trainee well-being, patient safety, and the delivery of training and induction for rotating junior doctors during the COVID-19 pandemic, and that it supports the development and delivery of practical training in psychiatry.
Out-of-hours (‘on-call’) work can be perceived by junior doctors to be a daunting experience, associated with feeling unprepared and less supported. Simulated on-call programmes have been used to great effect in medicine and surgery to improve junior doctors’ skills in task prioritisation, interpersonal communication and confidence on-call. However, few psychiatry-specific programmes exist.
We aimed to: i) develop a psychiatry-specific virtual-on-call programme; and ii) investigate whether the programme improved confidence amongst junior trainees in key areas of psychiatry practice.
Method
The Psychiatry Virtual-On-Call programme commenced in December 2020. It involves attending an introductory on-call lecture, followed later in the rotation by a 2-hour simulated on-call shift. All trainees are expected to attend during their attachment, and the simulated shifts are ongoing. During the shift, trainees are ‘bleeped’ with different psychiatry-specific tasks. They work through the tasks using local intranet policies and telephone advice from the on-call psychiatry registrar. Due to COVID-19, the sessions were delivered virtually. Participants completed a questionnaire evaluating confidence in ten domains, rated on a Likert scale from 0 to 10. Questionnaires were completed at four time points during the programme: pre- and post-introductory lecture, and pre- and post-simulated shift. Scores were compared using Mann-Whitney U tests. Significance was defined as P < 0.05, with Bonferroni correction applied for multiple testing.
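As an illustration of the analysis described above (a sketch only, not the authors' code), the comparison of pre- and post-session confidence ratings in each of the ten domains could be run with Mann-Whitney U tests and a Bonferroni-corrected threshold along the following lines; the domain names and score lists are hypothetical placeholders.

    from scipy.stats import mannwhitneyu

    def compare_domains(pre_scores, post_scores, alpha=0.05):
        """pre_scores/post_scores: dicts mapping domain name -> list of 0-10 Likert ratings."""
        corrected_alpha = alpha / len(pre_scores)  # Bonferroni correction for multiple testing
        results = {}
        for domain in pre_scores:
            u_stat, p = mannwhitneyu(pre_scores[domain], post_scores[domain],
                                     alternative="two-sided")
            results[domain] = {"U": u_stat, "p": p, "significant": p < corrected_alpha}
        return results

    # Example with made-up ratings for one domain
    pre = {"seclusion reviews": [2, 3, 4, 3, 5, 2, 4, 3]}
    post = {"seclusion reviews": [6, 7, 8, 7, 6, 8, 7, 9]}
    print(compare_domains(pre, post))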
Result
Twenty-nine trainees attended the introductory lecture; 25 and 21 trainees completed the pre- and post-lecture questionnaires, respectively. A non-significant improvement in confidence was reported in three domains: seclusion reviews, prescribing and detention under the Mental Health Act.
At the time of writing, ten trainees had attended the on-call shift. All participants completed a pre- and post-session questionnaire. The on-call shift was rated a useful learning experience (median score 9) and significantly increased perceived preparedness for on-call work from 3/10 to 7/10 (P < 0.001). Confidence was significantly improved in seven domains, most markedly in seclusion reviews, prescribing and Mental Health Act tasks.
Conclusion
The psychiatry virtual-on-call programme fills a niche in the training curriculum and is perceived by trainees to be a useful learning experience. The introductory lecture improved confidence in several domains, but not as effectively as the on-call shift. The on-call shift was well received by participants and significantly improved confidence in 7/10 domains. In summary, the virtual-on-call experience improves preparedness for out-of-hours psychiatry work. Follow-up of participants at the end of their psychiatry rotation will ascertain whether they found the programme useful during out-of-hours work.
Around 60 000 people in England live in mental health supported accommodation. There are three main types: residential care, supported housing and floating outreach. Supported housing and floating outreach aim to support service users in moving on to more independent accommodation within 2 years, but there has been little research investigating their effectiveness.
Aims
A 30-month prospective cohort study investigating outcomes for users of mental health supported accommodation.
Method
We used random sampling, accounting for relevant geographical variation factors, to recruit 87 services (22 residential care, 35 supported housing and 30 floating outreach) and 619 service users (residential care 159, supported housing 251, floating outreach 209) across England. We contacted services every 3 months to investigate the proportion of service users who successfully moved on to more independent accommodation. Multilevel modelling was used to estimate how much of the outcome and cost variations were due to service type and quality, after accounting for service-user characteristics.
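As a rough sketch of this kind of analysis (not the study's code, and with hypothetical column and file names), the association between service type and successful move-on could be estimated while accounting for the clustering of service users within services; here a logistic regression with cluster-robust standard errors stands in for the full multilevel models used in the study.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("supported_accommodation.csv")  # hypothetical dataset of service users

    # moved_on: 1 if the service user moved on to more independent accommodation
    # service_type: residential_care / supported_housing / floating_outreach
    # service_id: identifier of the recruiting service (the clustering unit)
    model = smf.logit(
        "moved_on ~ C(service_type, Treatment('residential_care')) + age + needs_score",
        data=df,
    ).fit(cov_type="cluster", cov_kwds={"groups": df["service_id"]})

    print(np.exp(model.params))      # odds ratios relative to residential care
    print(np.exp(model.conf_int()))  # 95% confidence intervals for the odds ratios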
Results
Overall 243/586 participants successfully moved on (residential care 15/146, supported housing 96/244, floating outreach 132/196). This was most likely for floating outreach service users (versus residential care: odds ratio 7.96, 95% CI 2.92–21.69, P < 0.001; versus supported housing: odds ratio 2.74, 95% CI 1.01–7.41, P < 0.001) and was associated with reduced costs of care and two aspects of service quality: promotion of human rights and recovery-based practice.
Conclusions
Most people do not move on from supported accommodation within the expected time frame. Greater focus on human rights and recovery-based practice may increase service effectiveness.
Emergency admissions to hospital are a major financial burden on health services. In one area of the United Kingdom (UK), we evaluated a predictive risk stratification tool (PRISM) designed to support primary care practitioners to identify and manage patients at high risk of admission. We assessed the costs of implementing PRISM and its impact on health services costs. At the same time as the study, but independent of it, an incentive payment (‘QOF’) was introduced to encourage primary care practitioners to identify high risk patients and manage their care.
METHODS:
We conducted a randomized stepped wedge trial in thirty-two practices, with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. We analysed routine linked data on patient outcomes for 18 months (February 2013 – September 2014). We assigned standard unit costs in pound sterling to the resources utilized by each patient. Cost differences between the two study phases were used in conjunction with differences in the primary outcome (emergency admissions) to undertake a cost-effectiveness analysis.
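The costing step can be illustrated with a minimal sketch (placeholder unit costs, not the tariffs used in the study): each patient's resource use is multiplied by a standard unit cost, phase totals are compared, and the cost difference is set against the difference in emergency admissions.

    # Hypothetical unit costs in pounds sterling, for illustration only
    UNIT_COSTS_GBP = {
        "ed_attendance": 124.0,
        "outpatient_visit": 111.0,
        "emergency_admission": 1863.0,
        "gp_contact": 36.0,
    }

    def patient_cost(resource_use):
        """resource_use: dict of counts keyed as in UNIT_COSTS_GBP."""
        return sum(UNIT_COSTS_GBP[item] * count for item, count in resource_use.items())

    def cost_per_admission_avoided(cost_intervention, cost_control,
                                   admissions_intervention, admissions_control):
        """Incremental cost-effectiveness ratio; denominator is admissions avoided."""
        delta_cost = cost_intervention - cost_control
        admissions_avoided = admissions_control - admissions_intervention
        return delta_cost / admissions_avoided if admissions_avoided else float("inf")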
RESULTS:
We included outcomes for 230,099 registered patients. We estimated a PRISM implementation cost of GBP0.12 per patient per year.
Costs of emergency department attendances, outpatient visits, emergency and elective admissions to hospital, and general practice activity were higher per patient per year in the intervention phase than in the control phase (adjusted δ = GBP76; 95 percent confidence interval (CI) GBP46 to GBP106), an effect that was consistent and generally increased with risk level.
CONCLUSIONS:
Despite low reported use of PRISM, it was associated with increased healthcare expenditure. This effect was unexpected and in the opposite direction to that intended. We cannot disentangle the effects of introducing the PRISM tool from those of imposing the QOF targets; however, since across the UK predictive risk stratification tools for emergency admissions have been introduced alongside incentives to focus on patients at risk, we believe that our findings are generalizable.
A predictive risk stratification tool (PRISM) to estimate a patient's risk of an emergency hospital admission in the following year was trialled in general practice in an area of the United Kingdom. PRISM's introduction coincided with a new incentive payment (‘QOF’) in the regional contract for family doctors to identify and manage the care of people at high risk of emergency hospital admission.
METHODS:
Alongside the trial, we carried out a complementary qualitative study of processes of change associated with PRISM's implementation. We aimed to describe how PRISM was understood, communicated, adopted, and used by practitioners, managers, local commissioners and policy makers. We gathered data through focus groups, interviews and questionnaires at three time points (baseline, mid-trial and end-trial). We analyzed data thematically, informed by Normalisation Process Theory (1).
RESULTS:
All groups showed high awareness of PRISM, but raised concerns about whether it could identify patients not yet known, and about whether there were sufficient community-based services to respond to care needs identified. All practices reported using PRISM to fulfil their QOF targets, but after the QOF reporting period ended, only two practices continued to use it. Family doctors said PRISM changed their awareness of patients and focused them on targeting the highest-risk patients, though they were uncertain about the potential for positive impact on this group.
CONCLUSIONS:
Though external factors supported its uptake in the short term, with a focus on the highest risk patients, PRISM did not become a sustained part of normal practice for primary care practitioners.
Personal health budgets (PHBs) were piloted in the National Health Service (NHS) in England between 2009 and 2012 and were found to have greater positive effects on quality of life and psychological well-being for those with mental health problems than commissioned services, as well as reducing their use of unplanned care. The government intends to extend PHBs in England for long-term conditions, including mental health, from April 2015. Given the importance of engaging clinicians in the next phase of PHB development, we provide an overview of the approach, synthesise the evidence from the national pilot and debate some of the opportunities and challenges. Balancing individual choice and recovery with concerns for risk, equity and the sustainability of existing community services is the central tension underpinning this innovation in mental health service delivery.
Current health policy assumes better quality services lead to better outcomes.
Aims
To investigate the relationship between quality of mental health rehabilitation services in England, local deprivation, service user characteristics and clinical outcomes.
Method
Standardised tools were used to assess the quality of mental health rehabilitation units and service users' autonomy, quality of life, experiences of care and ratings of the therapeutic milieu. Multilevel modelling was used to investigate relationships between service quality, service user characteristics and outcomes.
Results
A total of 52/60 (87%) National Health Service trusts participated, comprising 133 units and 739 service users. All aspects of service quality were positively associated with service users' autonomy, experiences of care and therapeutic milieu, but there was no association with quality of life.
Conclusions
Quality of care is linked to better clinical outcomes in people with complex and longer-term mental health problems. Thus, investing in quality is likely to show real clinical gains.
Purpose: To evaluate the accuracy of an external immobilisation system in patients receiving radiotherapy for prostate cancer.
Methods: Portal imaging data were audited in 20 patients treated using an in-house immobilisation system and 20 patients treated using an indexed commercial immobilisation system (Combifix™). Individual and group random and systematic errors were calculated to determine the accuracy of set-up using skin marks alone and with a no-action-level protocol.
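A minimal sketch of the error calculation, assuming per-fraction set-up displacements (in mm, one direction) recorded for each patient from portal imaging (this is the standard formulation, not necessarily the audit's own code): individual systematic and random errors are each patient's mean and standard deviation, and the group systematic and random errors are the spread of those means and the root-mean-square of those standard deviations.

    import numpy as np

    def setup_errors(displacements_by_patient):
        """displacements_by_patient: list of arrays, one array of displacements (mm) per patient."""
        means = np.array([np.mean(d) for d in displacements_by_patient])       # individual systematic errors
        sds = np.array([np.std(d, ddof=1) for d in displacements_by_patient])  # individual random errors
        overall_mean = means.mean()                # group mean displacement (M)
        group_systematic = means.std(ddof=1)       # group systematic error (Sigma)
        group_random = np.sqrt(np.mean(sds ** 2))  # group random error (sigma)
        return overall_mean, group_systematic, group_random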
Results: The initial results showed a larger systematic error with the Combifix™ in the anterior-posterior direction (2.7 mm) compared with the in-house system (1.5 mm). The possible source was identified as the difficulty of accurately aligning the laser to a curved couch top prior to setting the isocentre height. A change in the process of setting the isocentre was introduced, after which comparable baseline set-up accuracy was achieved with the Combifix™ using skin marks alone: a systematic error of ≤2.0 mm and a random error of ≤1.5 mm. The systematic errors were further reduced to <1 mm with an off-line no-action-level protocol.
Conclusion: Using the Combifix™ system, a high level of set-up accuracy was reproduced in routine daily practice.
Orang-utans (Pongo spp.) are primarily frugivorous (Morrogh-Bernard et al. 2009) and are often regarded as important seed dispersers (Corlett 1998). In Tanjung Puting, Borneo, Galdikas (1982) found intact seeds in 94% of faecal samples, with a median of 111 seeds per defecation; and in Ketambe, Sumatra, Rijksen (1978) found seeds in 44% of faecal samples. Furthermore, orang-utans have large day ranges (e.g. mean = 968 m, range = 280–2834 m across adults in Sabangau; Harrison 2009) and slow passage rates of digesta through the gut (Caton et al. 1999), and, hence, may disperse seeds far from parent trees. Many seeds are also spat out or discarded at distances up to 75 m from parent trees (Galdikas 1982).
This article describes the introduction into physiotherapy practice of Benesh Movement Notation, a method for recording observations of patients’ posture and movement sequences. Assessments of the technique show it to be a reliable and practical tool for clinical use.
By
Helen Blair Simpson, New York State Psychiatric Institute, New York, NY, USA
Phil Harrison-Read, Department of Psychiatry, Royal Free Hospital, London, UK
Edited by
Peter Tyrer, Imperial College of Science, Technology and Medicine, London
Kenneth R. Silk, University of Michigan, Ann Arbor
You will note from Part I that obsessive-compulsive disorder, formerly called obsessional neurosis before the word neurosis was eliminated from usage, has the highest clinical utility score of the disorders within the neurotic spectrum. It is therefore not surprising that we have much clearer guidelines and evidence for treatment than for others within the spectrum. The arguments in favour of both drug and psychological treatments are strong and are often complementary across the range of clinical indications in OCD. There are some disparities between the UK and the USA, but most are matters of emphasis rather than fundamental disagreements. We need more evidence from studies on combinations of drug and psychological interventions, as this combination is very commonly used in clinical practice.
Introduction
Obsessive-compulsive disorder (OCD) is a relatively common, usually chronic and sometimes very disabling condition, which by convention excludes clinically similar syndromes caused by psychoactive drugs or by a general medical condition. It is characterized by obsessions, compulsions or a combination of both. Obsessions are recurrent and persistent ideas, images or impulses that cause pronounced anxiety and which are usually recognized by the person affected as being self-produced, and yet irrational. Compulsions are intentional repetitive behaviours or mental acts (rituals) performed in response to obsessions or in response to self-imposed rules which are aimed at reducing distress or at preventing unacceptable or anxiety-provoking outcomes. Compulsive rituals are usually recognized as unreasonable, pointless or time-wasting and are also usually resisted by the individual affected.
Cardiovascular disease is more prevalent in patients with severe mental illness (SMI) than in the general population.
Method
Seven geographically diverse centres were assigned a nurse to monitor the physical health of SMI patients in secondary care over a 2-year period in the “Well-being Support Programme” (WSP). A physical health screen was performed and patients were given individual weight and lifestyle advice including smoking cessation to reduce cardiovascular risk.
Results
Nine hundred and sixty-six outpatients with SMI of more than 2 years' duration were enrolled. The completion rate at 2 years was 80%. Significant improvements were observed in levels of physical activity (p < 0.0001), smoking (p < 0.05) and diet (p < 0.0001). There were no changes in mean BMI, although 42% lost weight over 2 years. Self-esteem improved significantly. Low self-esteem decreased from 43% at baseline to 15% at 2 years (p < 0.0001). At the end of the programme, significant cardiovascular risk factors remained: 46% of subjects smoked, 26% had hypertension and 81% had a BMI >25.
Conclusion
Physical health problems are common in SMI subjects. Many patients completed 2 years of follow-up, suggesting that this format of programme is an acceptable option for SMI patients. Cardiovascular risk factors were significantly improved. Interventions such as the Well-being Support Programme should be made widely available to people with SMI.
Regular monitoring of the heart rate (HR):speed relationship may help evaluate response to training and aid in the early detection of problems. This relationship is normally determined using a treadmill or via a ridden test conducted outside on a track. Simple practical alternative methods to obtain this relationship without access to a treadmill or a track could be of value in the field. To evaluate whether the HR:speed relationship could be determined via an indoor ridden test or a lunge test, HR was monitored on two occasions at least 3 h apart, in 12 adult horses (mixed breed) in a familiar environment during a 5 or 7 m radius circle lunge (unridden) test (5LT or 7LT) and an incremental (ridden) test (RT) on the same day. The RT comprised two ridden laps of the perimeter of a 60 × 40 m indoor school at walk, three laps at trot, three at medium canter and four at fast canter (all on the right rein). The speed of each lap was recorded. The LT comprised lunging for 2 min on each rein at walk, trot and canter. Speed was determined from the number of laps completed and measurement of the distance travelled. HR and speed were highly correlated in both lunge and ridden tests (both r = 0.99 ± 0.01). V140 on the ridden test (5.2 ± 0.6 m s−1) was significantly greater than on the pooled lunge test data (4.4 ± 0.6 m s−1; P < 0.0001). There was a negative correlation between recovery HR at 2 min following either the LT or RT and V140 (P < 0.05). The slope of the HR versus speed relationship and V140 were not different between RT and 7LT, but were significantly different from those of the 5LT (P < 0.05). V140 was always lower on the lunge tests compared with the ridden test. This suggests that, in this study, lunging without a rider increased the metabolic demand above that for being ridden at a similar speed. V140 determined by the 7LT gave the closest approximation to the V140 determined by the RT. The HR:speed relationship can be obtained either from riding an incremental test in an indoor school or from an unridden lunge test.
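As a worked illustration (hypothetical stage speeds and heart rates, not the study's data), V140 follows from a linear fit of heart rate against speed across the incremental stages, solved for the speed at which HR = 140 beats min−1.

    import numpy as np

    speed = np.array([1.6, 3.2, 5.0, 7.1])       # mean speed of each stage, m/s (made up)
    heart_rate = np.array([90, 118, 146, 178])   # heart rate at each stage, beats/min (made up)

    slope, intercept = np.polyfit(speed, heart_rate, 1)  # HR = slope * speed + intercept
    v140 = (140 - intercept) / slope                     # speed at a heart rate of 140
    print(f"V140 ≈ {v140:.2f} m/s")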
Cardiovascular activity was measured at resting baseline and in response to a car racing game, undertaken in competition or in cooperation with an experimenter, or individually. Competitiveness and win and goal orientations were assessed by questionnaire. Competition provoked increases in blood pressure and heart rate, and a significant shortening of the preejection period, an index of enhanced beta-adrenergic influences on the heart. The cooperation task was largely without effect, and although the solo task affected cardiovascular activity, it did so to a lesser extent and much less consistently than did the competition task. The three task conditions, then, were largely distinguishable by their capacity to activate beta-adrenergic processes. Participants high in competitiveness and desire to win showed higher blood pressure reactions and greater shortening of the preejection period to competition than those low in these characteristics.
This paper describes how a teacher explored her teaching of an introduction to recorder playing to children, and how she tested her belief that music notation was an essential component of that teaching.
Two roughly parallel classes of 7- to 8-year-olds were introduced to recorder playing. One group was given tuition accompanied by music notation; the other group learned to play by ear. An interaction was found between the ability of the child and the relative success of a method of teaching. More able pupils became demotivated without access to written music, whilst less able pupils retained their interest when playing by ear. An intermediate strategy, using notation with the names of the notes written below, proved effective for those of average ability.
A holistic assessment of the quality of performance produced by the two groups was made independently. Contrary to expectations, the playing-by-ear group produced a better quality of sound than the group exposed to music notation.
The implication for the introduction of music performance to young children is discussed.
In the early 1930s, a significant number of American artists who were aligned, either practically or theoretically, with the Communist Party became supporters of the New Deal. Artist members of the John Reed Club, a Party-directed cultural organization, were enjoined to develop “revolutionary art” as a vehicle for the type of social change that had transformed tsarist Russia into the Soviet Union. Yet many of them found Roosevelt's “peaceful revolution” worthy of the highest accolade they could bestow on a subject: its inclusion as an affirmative theme in their work. In so viewing it, they ran counter to the Party's stated policy in opposition to socioeconomic reform, a policy that was later reversed to accommodate the New Deal and thus vindicate the artists' position. From its inception, the New Deal seemed to offer artists an attractive alternative to the forcible overthrow of all existing social conditions predicted by Marx and promulgated by the Communists.