Borderline personality disorder (BPD) is a highly stigmatised mental disorder. A substantial body of research highlights the stigma experienced by individuals with BPD and the impact of such prejudice on their lives. Similarly, much research documents the benefits of engaging in compassionate acts, including improved mental health recovery. However, there is a notable gap in understanding how the stigma experienced by people with BPD acts as a barrier to compassion and, by extension, to recovery. This paper synthesises these perspectives, examining common barriers to compassionate acts, the impact of stigma on people with BPD, and how these barriers are exacerbated for individuals with BPD by the stigma they face. The synthesis highlights the critical role of compassion in supporting the recovery of individuals with BPD, while also revealing the significant barriers posed by stigma. Addressing these challenges requires a comprehensive understanding of the intersection between compassion and stigma, informing the development of targeted interventions to promote well-being and recovery for individuals with BPD.
Background: Surgical delays are common in Canada. Wait times in elective spine surgery and their impact on outcomes remain uncharacterized. Methods: This was a single-center analysis of elective spine surgery data from 2009 to 2020. Wait times between referral and consultation (T1), consultation and surgical booking (Ti), and booking and surgery (T2) were assessed. Results: 2041 patients were included. Longitudinal analyses were adjusted for age, sex, diagnosis, and surgical volume, while outcome analyses were age- and sex-adjusted. Total wait time (T1+Ti+T2) increased 8.1% annually (p<0.001). T1 decreased 4.3% annually (p=0.032) and was not associated with adverse events (AEs) or disposition. Every 100 days of T1 was associated with 1.0% longer hospitalization (p=0.001). Ti increased 21.0% annually (p<0.001). Every 100 days of Ti was associated with 2.9% increased odds of an adverse event (p=0.002), 1.8% longer hospitalization (p<0.001), and 15.9% increased likelihood of discharge home (p<0.001). T2 increased 7.0% annually (p<0.001) and was not associated with AEs. Every 100 days of T2 was associated with 11.6% longer hospitalization (p<0.001) and 76.5% increased likelihood of discharge home (p<0.001). Conclusions: Total wait times for elective spine surgery increased between 2009 and 2020. Notably, Ti increased ninefold and was associated with AEs. This study highlights areas of delay and targets for healthcare optimization.
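For readers unfamiliar with how an "X% annual change" estimate of this kind is typically derived, the sketch below fits a log-linear model of wait time on calendar year and converts the year coefficient into an annual percentage change. The data, column names, and adjustment variables are illustrative assumptions, not the study's dataset or code.

# Hypothetical sketch: deriving an annual percent change in wait time
# from a log-linear model (simulated data, illustrative column names).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "year": rng.integers(2009, 2021, n),
    "age": rng.normal(60, 12, n),
    "sex": rng.choice(["F", "M"], n),
    "wait_days": rng.lognormal(mean=4.0, sigma=0.8, size=n),
})

# Log-linear model: each additional calendar year multiplies the expected wait by exp(beta).
model = smf.ols("np.log(wait_days) ~ year + age + sex", data=df).fit()
annual_pct_change = (np.exp(model.params["year"]) - 1) * 100
print(f"Estimated annual change in wait time: {annual_pct_change:+.1f}%")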
Background: Mountain biking (MTB) is an increasingly popular sport that has been associated with serious spinal injuries, which can have devastating effects on patients and significant impacts on healthcare resources. Herein, we characterized the occurrence of these MTB spinal injuries over a 15-year period and analyzed the affiliated acute-care hospital costs. Methods: Patients seen at Vancouver General Hospital for MTB spinal injuries between 2008-2022 were retrospectively reviewed. Demographics, injury details, treatments, outcomes, and resource requirements for acute hospitalization were collected. The Canadian Institute for Health Information was referenced for cost analysis. Results: Over the 15 years of analysis, 149 MTB spinal injuries occurred. The majority (87.2%) were male. 59 (39.6%) were associated with spinal cord injury; most of these were in the cervical spine (72.3%) and majority were AIS Grade A (36.1%). 102 patients (68.5%) required spine surgery; 26 (17.4%) required intensive care; 34 (22.8%) required inpatient rehabilitation. Mean length of stay was 13.5 days and acute admission costs for the healthcare system averaged $35,251 (95% CI $27,080-$43,424). Conclusions: MTB spinal injuries are associated with significant medical, personal, and financial burden. As injury prevention remains paramount, further investigation of the roles of education and safety measures is recommended.
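A mean cost with a 95% confidence interval of the kind reported above can be obtained with a standard t-based interval. The sketch below illustrates this on simulated per-admission costs; the cost values are hypothetical and do not reproduce the study data.

# Illustrative sketch: mean acute-care cost with a t-based 95% confidence interval
# (costs below are simulated, not study data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
costs = rng.lognormal(mean=10.0, sigma=0.9, size=149)  # simulated per-admission costs

mean_cost = costs.mean()
sem = stats.sem(costs)  # standard error of the mean
ci_low, ci_high = stats.t.interval(0.95, df=len(costs) - 1, loc=mean_cost, scale=sem)
print(f"Mean ${mean_cost:,.0f} (95% CI ${ci_low:,.0f}-${ci_high:,.0f})")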
Cognitive impairments are well-established features of psychotic disorders and are present when individuals are at ultra-high risk for psychosis. However, few interventions target cognitive functioning in this population.
Aims
To investigate whether omega-3 polyunsaturated fatty acid (n−3 PUFA) supplementation improves cognitive functioning among individuals at ultra-high risk for psychosis.
Method
Data (N = 225) from an international, multi-site, randomised controlled trial (NEURAPRO) were analysed. Participants were given omega-3 supplementation (eicosapentaenoic acid and docosahexaenoic acid) or placebo over 6 months. Cognitive functioning was assessed with the Brief Assessment of Cognition in Schizophrenia (BACS). Mixed two-way analyses of variance were computed to compare the change in cognitive performance between omega-3 supplementation and placebo over 6 months. An additional biomarker analysis explored whether change in erythrocyte n−3 PUFA levels predicted change in cognitive performance.
Results
The placebo group showed a modestly greater improvement over time than the omega-3 supplementation group for motor speed (ηp² = 0.09) and BACS composite score (ηp² = 0.21). After repeating the analyses without individuals who transitioned to psychosis, motor speed was no longer significant (ηp² = 0.02), but the composite score remained significant (ηp² = 0.02). Change in erythrocyte n−3 PUFA levels did not predict change in cognitive performance over 6 months.
Conclusions
We found no evidence to support the use of omega-3 supplementation to improve cognitive functioning in ultra-high risk individuals. The biomarker analysis suggests that this finding is unlikely to be attributed to poor adherence or consumption of non-trial n−3 PUFAs.
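As a rough illustration of the mixed (between-group × time) ANOVA described in the Method above, the sketch below uses the pingouin package on simulated data. The column names, group labels, and values are assumptions made for illustration and do not reproduce the NEURAPRO analysis.

# Minimal sketch of a mixed (treatment group x time) ANOVA; data, column names,
# and group labels here are illustrative only.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(2)
rows = []
for group in ["omega3", "placebo"]:
    for subj in range(40):
        subject_id = f"{group}_{subj}"
        for time in ["baseline", "month6"]:
            rows.append({
                "subject": subject_id,
                "group": group,
                "time": time,
                "bacs_composite": rng.normal(50, 10),
            })
df = pd.DataFrame(rows)

# Mixed ANOVA: 'time' is the within-subject factor, 'group' the between-subject factor.
aov = pg.mixed_anova(data=df, dv="bacs_composite", within="time",
                     subject="subject", between="group", effsize="np2")
print(aov[["Source", "F", "p-unc", "np2"]])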
Background: Mean arterial pressure augmentation is an established practice in the management of patients with spinal cord injury (SCI). We present the first data investigating the effectiveness of intrathecal pressure (ITP) reduction through cerebrospinal fluid drainage (CSFD) in managing patients with acute traumatic SCI at a large academic center. Methods: Data from 6 patients with acute traumatic SCI were included. A lumbar intrathecal catheter was used to monitor ITP and the volume of CSFD. CSFD was performed and recorded hourly. ITP recordings were collected hourly, and the change in ITP was calculated (value the hour after CSFD minus the hour before). In total, 369 data points were collected, and change in ITP was plotted against volume of CSFD. Results: Data across all patients showed variability in ITP over time without a significant trend (slope=0.016). We found no significant change in ITP with varying amounts of CSFD (slope=0.007, r2=0.00, p=0.88). Changes in ITP were not significantly different across CSFD volume groups, but the variation in the data decreased with increasing levels of CSFD. Conclusions: We present the first known data on changes in ITP with varying degrees of CSFD in patients with acute traumatic SCI. These results may provide insight into the complexity of ITP changes post-injury and help inform future SCI management.
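The slope, r2, and p-value reported above correspond to a simple linear regression of hourly ITP change on CSFD volume. A minimal sketch of that kind of analysis is shown below, on simulated values rather than the study data.

# Hedged sketch: regressing the hourly change in intrathecal pressure (ITP)
# on CSF drainage volume (all values simulated).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
csf_drained_ml = rng.uniform(0, 20, 369)   # hourly CSF drainage volume (mL), simulated
delta_itp = rng.normal(0, 3, 369)          # hourly change in ITP (mmHg), simulated

res = stats.linregress(csf_drained_ml, delta_itp)
print(f"slope={res.slope:.3f}, r2={res.rvalue ** 2:.2f}, p={res.pvalue:.2f}")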
Background: Length of stay (LOS) is a surrogate for care complexity and a determinant of occupancy and service provision. Our primary goal was to assess changes in and determinants of LOS at a quaternary spinal care center. Secondary goals included identifying opportunities for improvement and determinants of future service planning. Methods: This is a prospective study of patients admitted from 2006 to 2019. Data included demographics, diagnostic category (degenerative, oncology, deformity, trauma, other), LOS (mean, median, interquartile range, standard deviation) and in-hospital adverse events (AEs). Results: 13,493 admissions were included. Mean age increased from 48.4 years (2006) to 58.1 years (2019) (p<0.001). Mean age increased over time for patients treated for deformity (p<0.001), degenerative pathology (p<0.001) and trauma (p<0.001), but not oncology (p=0.702). Overall LOS has not changed over time (p=0.451). LOS increased in patients with degenerative pathology (p=0.019) but not deformity (p=0.411), oncology (p=0.051) or trauma (p=0.582). Emergency admissions increased over time for degenerative pathologies (p<0.001). AEs and surgical site infections (SSIs) decreased over time (p<0.001). Conclusions: This is the first North American study to analyze temporal trends in LOS for spine surgery in an academic center. Understanding temporal trends in LOS and patient epidemiology can provide opportunities for intervention, targeted at geriatric populations, to reduce LOS.
Background: Prolonged length of stay (LOS) is associated with increased resource utilization and worse outcomes. The goal of this study was to identify patient, surgical and systemic factors associated with prolonged LOS, overall and per diagnostic category, for adults admitted to a quaternary spinal care center. Methods: We performed a retrospective analysis of 13,493 admissions from 2006 to 2019. Factors analyzed included patient age, sex, emergency vs elective admission, diagnostic category (degenerative, deformity, oncology, trauma), presence of neurological deficits in trauma patients, ASIA score, operative management and duration, blood loss, and adverse events (AEs). Univariate and multivariate analyses determined factors associated with prolonged LOS. Results: Overall mean LOS (±SD) was 15.80 (±34.03) days. In multivariate analyses, predictors of prolonged LOS were advanced age (p<0.001), emergency admission (p<0.001), advanced ASIA score (p<0.001), operative management (p=0.043), and presence of AEs (p<0.001), including surgical site infection (SSI) (p=0.001), other infections (systemic and UTI) (p<0.001), delirium (p=0.006), and pneumonia (p<0.001). The effects of age, emergency admission, and AEs on LOS differed by diagnostic category. Conclusions: Understanding the patient and disease factors that affect LOS provides opportunities for quality improvement intervention and allows for an informed preoperative discussion with patients. Future interventions can be targeted to maximize patient outcomes, optimize care quality, and decrease costs.
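One way to illustrate the multivariable analysis described above is to model a binary "prolonged LOS" outcome with logistic regression, as sketched below on simulated data. The predictors, coefficients, and outcome definition are illustrative assumptions and are not the study's actual model.

# Illustrative sketch: multivariable logistic regression for a binary
# "prolonged LOS" outcome (all data simulated, predictors hypothetical).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 2000
df = pd.DataFrame({
    "age": rng.normal(55, 18, n),
    "emergency": rng.integers(0, 2, n),
    "adverse_event": rng.integers(0, 2, n),
    "operative": rng.integers(0, 2, n),
})
# Simulated outcome: odds of prolonged LOS rise with age, emergency admission, and AEs.
logit = -4 + 0.03 * df["age"] + 0.8 * df["emergency"] + 1.1 * df["adverse_event"]
df["prolonged_los"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.logit("prolonged_los ~ age + emergency + adverse_event + operative", data=df).fit()
print(np.exp(model.params))  # odds ratios for each predictor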
A multi-disciplinary expert group met to discuss vitamin D deficiency in the UK and strategies for improving population intakes and status. Changes to UK Government advice since the 1st Rank Forum on Vitamin D (2009) were discussed, including the rationale for setting a reference nutrient intake (10 µg/d; 400 IU/d) for adults and children (4+ years). Current UK data show inadequate intakes among all age groups and a high prevalence of low vitamin D status among specific groups (e.g. pregnant women and adolescent males/females). Evidence of widespread deficiency within some minority ethnic groups, resulting in nutritional rickets (particularly among Black and South Asian infants), raised particular concern. The latest data indicate that UK population vitamin D intakes and status remain relatively unchanged since Government recommendations changed in 2016. Vitamin D food fortification was discussed as a potential strategy to increase population intakes. Data from dose–response and dietary modelling studies indicate dairy products, bread, hens’ eggs and some meats as potential fortification vehicles. Vitamin D3 appears more effective than vitamin D2 for raising serum 25-hydroxyvitamin D concentration, which has implications for choice of fortificant. Other considerations for successful fortification strategies include: (i) need for ‘real-world’ cost information for use in modelling work; (ii) supportive food legislation; (iii) improved consumer and health professional understanding of vitamin D’s importance; (iv) clinical consequences of inadequate vitamin D status; and (v) consistent communication of Government advice across health/social care professions, and via the food industry. These areas urgently require further research to enable universal improvement in vitamin D intakes and status in the UK population.
Bell's palsy is a lower motor neurone facial weakness of unknown aetiology, although reactivation of a virus within the facial nerve has been proposed.
Methods
A prospective study was conducted of Bell's palsy cases presenting to our paediatric ENT unit over a 19-week period, from February to June 2020. Patients were invited for severe acute respiratory syndrome coronavirus-2 antibody testing. A text-message questionnaire was sent to other ENT centres to determine their observational experience.
Results
During the study period, 17 children presented with Bell's palsy, compared with only 3 children during the same period in the previous year (p < 0.0001). Five patients underwent severe acute respiratory syndrome coronavirus-2 antibody testing; all results were negative. Four of the 15 centres questioned perceived an increased incidence of paediatric Bell's palsy.
Conclusion
Clinicians are encouraged to be vigilant to the increase in paediatric Bell's palsy seen during the coronavirus disease 2019 pandemic, which may represent a post-viral sequela of coronavirus disease 2019.
An intermediate-depth (1751 m) ice core was drilled at the South Pole between 2014 and 2016 using the newly designed US Intermediate Depth Drill. The South Pole ice core is the highest-resolution interior East Antarctic ice core record that extends into the glacial period. The methods used at the South Pole to handle and log the drilled ice, the procedures used to safely retrograde the ice back to the National Science Foundation Ice Core Facility (NSF-ICF), and the methods used to process and sample the ice at the NSF-ICF are described. The South Pole ice core exhibited minimal brittle ice, which was likely due to site characteristics and, to a lesser extent, to drill technology and core handling procedures.
Coronavirus disease 2019 personal protective equipment has been reported to affect communication in healthcare settings. This study sought to identify those challenges experimentally.
Method
Bamford–Kowal–Bench speech discrimination in noise performance of healthcare workers was tested under simulated background noise conditions from a variety of hospital environments. Candidates were assessed for ability to interpret speech with and without personal protective equipment, with both normal speech and raised voice.
Results
There was a significant difference in speech discrimination scores between subjects tested with and without personal protective equipment under simulated operating theatre background noise (70 dB).
Conclusion
Wearing personal protective equipment can impact communication in healthcare environments. Efforts should be made to remind staff about this burden and to seek alternative communication paradigms, particularly in operating theatre environments.
Cold dissection is the most commonly used tonsillectomy technique, with low post-operative haemorrhage rates. Coblation is an alternative technique that may cause less pain, but could have higher post-operative haemorrhage rates.
Objective
This study evaluated the peri-operative outcomes in paediatric tonsillectomy patients by comparing coblation and cold dissection techniques.
Methods
A systematic review was conducted of all comparative studies of paediatric coblation and cold dissection tonsillectomy, up to December 2018. Any studies with adults were excluded. Outcomes such as pain, operative time, and intra-operative, primary and secondary haemorrhages were recorded.
Results
Seven studies contributed to the summative outcome. Coblation tonsillectomy appeared to result in less pain, less intra-operative blood loss (p < 0.01) and a shorter operative time (p < 0.01). There was no significant difference between the two groups for post-operative haemorrhage (p > 0.05).
Conclusion
The coblation tonsillectomy technique may offer better peri-operative outcomes than cold dissection, and should therefore be considered before cold dissection tonsillectomy in paediatric cases.
A 2-yr field study was conducted to determine effects of posttreatment irrigation timing on pendimethalin efficacy and dissipation in turfgrass. Factors investigated included herbicide rate, formulation, and the interval between pendimethalin application and the initial posttreatment irrigation. Plots received an initial posttreatment irrigation of 1.25 cm at 0, 7, 14, 21, or 28 d after treatment. Pendimethalin efficacy on smooth crabgrass was evaluated, and turfgrass foliage and the upper 2.5-cm layer of soil were periodically assayed for pendimethalin residues. Pendimethalin 1.71% granular provided better weed control than pendimethalin 60% wettable powder at all rates, irrigation events, and years. Efficacy of granular pendimethalin was not affected by a delay in posttreatment irrigation, whereas efficacy of pendimethalin in the wettable powder formulation was reduced when irrigation was applied later than the day of treatment. Chromatographic analyses indicated that an average of 54% of the applied pendimethalin (wettable powder formulation) was retained on turfgrass foliage immediately after treatment, compared to 9% for the granular formulation. Soil residue analyses confirmed that a greater proportion of applied pendimethalin reached the soil surface immediately after treatment in the granular formulation than in the wettable powder formulation.
A study was conducted in 1994 and 1995 at two Mississippi locations to evaluate preplant incorporated (PPI) and preemergence (PRE) applications of alachlor, clomazone, SAN 582, metolachlor, pendimethalin, and trifluralin, and postemergence (POST) applications of AC 263,222 and imazethapyr alone or followed by clethodim late postemergence (LPOST) for red rice control in soybean. Applications of 110 g ai/ha clethodim increased red rice control when following any earlier herbicide application at one location that harbored a high natural infestation. In 1 yr at one location, red rice seedhead suppression from PPI and PRE herbicide applications alone was greater than 95% due to high activity from herbicides and drought conditions during red rice seedhead development. Early postemergence (EPOST) applications of 30 g ae/ha AC 263,222 suppressed at least 95% of red rice seedheads, regardless of year, location, or clethodim LPOST application. At one location, any treatment where 110 g/ha clethodim followed an earlier herbicide application suppressed red rice seedheads at least 95%. Compared to the nontreated control, only AC 263,222 injured soybean (30%) and reduced soybean yield (200 kg/ha).
Glyphosate was evaluated at 0.8, 1.3, and 1.7 kg ae/ha applied at the two-leaf, four-leaf, or two- to three-tiller growth stage for red rice control. In addition, red rice seedheads were counted concurrently with soybean harvest at each of three locations to assess treatment effect on seedhead reduction. Field studies were conducted at Starkville, MS, in 1994 and 1995 and Shaw, MS, in 1995. A significant rate response was not observed for red rice control 2 and 4 wk after treatment (WAT) or for seedhead reductions. Glyphosate controlled red rice 88, 91, and 88% 2 WAT when applied to two-leaf, four-leaf, or two- to three-tiller red rice, respectively. Due to subsequent seedling emergence, control from glyphosate applications to two- or four-leaf red rice 4 WAT was 51 and 84%, respectively. Red rice treated at the two- to three-tiller stage was controlled 91% 4 WAT. When compared to the nontreated control, seedheads were reduced 97% by two- to three-tiller applications, compared to 87 and 56% reductions from four- and two-leaf applications, respectively.
Experiments were conducted from 1973 through 1975 on Lucedale sandy loam to determine the influence of in-row cotton (Gossypium hirsutum L. ‘Stoneville 213’) densities on the competitiveness of low-level infestations of sicklepod (Cassia obtusifolia L.) and pigweed (Amaranthus spp.). Weeds were established at densities of 0, 4, 12, and 32 weeds per 15 m of row and allowed to compete the entire season with cotton grown at densities of 5, 10, or 20 plants/m of row, corresponding to 47,000, 94,000, and 187,000 cotton plants/ha. Conventional cultural practices were employed in these experiments. Cotton yields were inversely related to weed density; however, the density of cotton did not influence the competitive effect of sicklepod or pigweed. Dry weight of pigweed or sicklepod was reduced when competing with 187,000 cotton plants/ha.
Influence of time of planting and distance from the cotton row of pitted morningglory (Ipomoea lacunosa L.), prickly sida (Sida spinosa L.), and redroot pigweed (Amaranthus retroflexus L.) on yield of seed cotton (Gossypium hirsutum L. ‘Stoneville 213’) was determined on Decatur clay loam during 1975 through 1978. Weed growth was measured in 1977 and 1978. Seeds of the three weed species were planted 15, 30, or 45 cm from the cotton row at time of planting cotton or 4 weeks later. Weeds planted 4 weeks after planting cotton grew significantly less than did weeds planted at the same time as cotton. When planted with cotton, redroot pigweed produced over twice as much fresh weight as did prickly sida or pitted morningglory. The distance that weeds were planted from the cotton row did not affect weed growth in 1978, but did in 1977. The distance that weeds were planted from the cotton row did not affect their competitiveness in any year as measured by yield of cotton. However, in each year, yields of cotton were reduced to a greater extent by weeds planted with cotton than when planted 4 weeks later. In 3 of 4 yr, there were significant differences in competitiveness of each of the three weed species with cotton.
Cotton (Gossypium hirsutum L. ‘Stoneville 213’) was grown with densities of sicklepod (Cassia obtusifolia L.) or redroot pigweed (Amaranthus retroflexus L.) ranging from 0 to 32 weeds/15 m of row. Regression of seed cotton yields on weed density revealed a linear decrease in yield with increasing weed densities. In the 3 yr these studies were conducted, losses in hand-harvested yields of seed cotton ranged from 34 to 43 kg/ha for each sicklepod plant per 15 m of row and 21 to 38 kg/ha for each redroot pigweed plant per 15 m of row. Under comparable weed densities, yields of seed cotton differed only slightly when hand harvested or mechanically harvested. Mechanical harvesting efficiencies of cotton were reduced only at higher densities of weeds. The percentage of trash in cotton generally increased with increasing density of weeds. Neither sicklepod nor redroot pigweed affected cotton grade or micronaire.
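The yield-versus-weed-density regression described above can be illustrated with a simple linear fit, where the slope estimates the seed cotton yield lost per weed per 15 m of row. The sketch below uses simulated data, not the study's measurements.

# Sketch of a yield-vs-weed-density regression (simulated design and yields).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
weed_density = np.repeat([0, 4, 12, 32], 6)   # weeds per 15 m of row (simulated design)
yield_kg_ha = 2400 - 38 * weed_density + rng.normal(0, 120, weed_density.size)

res = stats.linregress(weed_density, yield_kg_ha)
print(f"yield loss per weed: {-res.slope:.0f} kg/ha (r2={res.rvalue ** 2:.2f})")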