Coping-Together is a self-directed, self-management intervention initially developed for patients in the early stages of cancer and their caregivers. This study evaluated its acceptability among patients with advanced cancer and their caregivers.
Methods
Twenty-six participants (patients with advanced cancer n = 15 and their caregivers n = 11) were given the Coping-Together materials (6 booklets and a workbook) for 7 weeks. Participants were interviewed twice during this time to solicit feedback on the intervention’s content, design, and recommended changes. Audio-recorded interviews were transcribed verbatim, and thematic analysis was conducted.
Results
Participants found Coping-Together was mostly relevant. All (n = 26, 100%) participants expressed interest and a desire to improve their self-management skills. Perceived benefits included learning to develop SMARTTER (specific, measurable, attainable, relevant, timely, and done together) self-management plans, normalizing challenges, and enhancing communication within the dyad and with their healthcare team. Most (n = 25, 96%) identified strategies from the booklets that benefited them. Top strategies learned were skills to manage physical health (n = 20, 77%) (e.g., monitoring symptoms), emotional well-being (n = 21, 81%) (e.g., reducing stress by reframing thoughts), and social well-being (n = 24, 92%) (e.g., communicating with their healthcare team). Barriers included illness severity and time constraints. Unique advanced cancer needs still to be integrated include support related to fear of death, uncertainty, palliative care, and advance care planning. Suggested modifications involved enhancing accessibility and including more advanced cancer information (e.g., end-of-life planning, comfort care, resources).
Significance of results
Participants reported several benefits from using Coping-Together, with minimal adaptations needed. Creating SMARTTER self-management plans helped them implement self-management strategies. Specific areas for improvement included accessibility and more content related to advanced cancer. Findings demonstrate that Coping-Together is acceptable for those living with advanced cancer and their caregivers, offering much of the support needed to enhance day-to-day quality of life.
In 2018 and 2019, China’s outbreak of African swine fever (ASF) and the U.S.–China trade war captured media headlines worldwide. This research uses a unique data set of media headlines and sentiments to estimate the impact of media on U.S. lean hog futures prices for nearby and distant expiration contracts. Findings suggest futures prices are influenced by news media content, with results differing by time to contract expiration and sentiment of the headline. International headlines with positive and negative connotations toward ASF and the trade war have more significant effects, indicating that sensationalist media creates greater price movements than neutral headlines.
Diet is implicated in the development of inflammatory bowel disease (IBD). However, the role of diet in reducing inflammation and managing prevalent disease is unclear (1–3). Previous studies have analysed the relationship between dietary patterns and the occurrence of flares or symptoms, but not disease activity or inflammation (4–5). It is important to explore the role of habitual diet in the management of IBD to provide targeted dietary recommendations. We explored the relationship of dietary intake with disease activity and inflammation in an Australian adult cohort with and without IBD. We analysed dietary and clinical data from the Australian IBD Microbiome (AIM) study. AIM is a prospective longitudinal cohort study of adults and children with Crohn’s disease (CD), ulcerative colitis (UC) and healthy controls (HC). Habitual dietary intake of food groups, fibre, polyphenols and fermented foods was derived by merging dietary data from 3-day food records and food frequency questionnaires with the PhenolExplorer and Australian Fibre Categories databases. Dietary patterns were explored using principal component analysis (PCA) and cluster analysis (CA) in IBM SPSS Statistics (V29). Associations between dietary intake, clinical disease activity categorised as remission or active, and faecal calprotectin (FCAL) were explored in adult participants. A total of 412 participants (IBD = 223, HC = 189) were included. FCAL data were available for 211 participants (HC = 100, CD = 49, UC = 62). Median (IQR) FCAL at baseline was 20 (20) mg/kg for HC and 33 (127) mg/kg for IBD, indicating clinically irrelevant inflammation (FCAL > 50 mg/kg indicates clinical inflammation). PCA identified 7 distinct dietary patterns for adults with IBD. A dietary pattern of high plant diversity was associated with active CD.
In the total IBD cohort, low adherence to a ‘Prudent’ pattern was positively associated with low FCAL, and high adherence to a ‘Meat-eaters’ dietary pattern was positively associated with moderate FCAL. CA revealed 3 distinct clusters amongst participants with IBD. No significant difference between diet cluster and disease activity or FCAL was seen. There were no significant differences in intake of fibre or polyphenols between remission and active disease in participants with IBD. A significant difference in total, soluble and insoluble fibre intake across FCAL categories was seen, with higher fibre intake associated with lower FCAL. Higher plant-diversity and ‘Prudent’ dietary patterns are associated with active disease and higher FCAL in Australian adults with IBD. Reverse causality cannot be ruled out; analysis of larger cohorts and clinical trial data is needed to clarify this.
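The pattern-extraction workflow described above (the study used IBM SPSS) can be sketched in Python with scikit-learn as a stand-in; the intake matrix here is simulated, and only its dimensions mirror the study's:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Hypothetical intake matrix: rows = participants, columns = food groups
rng = np.random.default_rng(0)
intake = rng.gamma(shape=2.0, scale=1.5, size=(412, 12))  # e.g. g/day per food group

# Standardize, then extract dietary patterns as principal components
z = StandardScaler().fit_transform(intake)
pca = PCA(n_components=7)          # the study retained 7 patterns
scores = pca.fit_transform(z)      # each participant's score on each pattern

# Cluster participants on their pattern scores (the study found 3 clusters)
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)

print(scores.shape)        # (412, 7)
print(len(set(clusters)))  # 3
```

Participants' pattern scores, not the raw food groups, are what would then be related to disease activity or FCAL categories.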
Data from a national survey of 348 U.S. sports field managers were used to examine the effects of participation in Cooperative Extension events on the adoption of turfgrass weed management practices. Of the respondents, 94% had attended at least one event in the previous 3 yr. Of this 94%, 97% reported adopting at least one practice as a result of knowledge gained at an Extension turfgrass event. Half of the respondents had adopted four or more practices; a third adopted five or more practices. Nonchemical, cultural practices were the most-adopted practices (65% of respondents). Multiple regression analysis was used to examine factors explaining practice adoption and Extension event attendance. Compared to attending one event, attending three events increased total adoption by an average of one practice. Attending four or more events increased total adoption by two practices. Attending four or more events (compared to one event) increased the odds of adopting six individual practices by 3- to 6-fold, depending on the practice. This suggests that practice adoption could be enhanced by encouraging repeat attendance among past Extension event attendees. Manager experience was a statistically significant predictor of the number of Extension events attended but a poor direct predictor of practice adoption. Experience does not appear to increase adoption directly, but indirectly, via its impact on Extension event attendance. In addition to questions about weed management generally, the survey asked questions specifically about annual bluegrass management. Respondents were asked to rank seven sources of information for their helpfulness in managing annual bluegrass. There was no single dominant information source, but Extension was ranked as the most helpful more often than any other source (by 22% of the respondents) and was ranked among the top three by 53%, closely behind field representative/local distributor sources at 54%.
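The "3- to 6-fold" odds figures above come from exponentiating logistic-regression coefficients; a minimal illustration (the coefficient value here is hypothetical, not from the survey):

```python
import math

# A logistic-regression coefficient translates to an odds ratio via exp().
# Hypothetical coefficient for "attended 4+ events" on adopting a given practice:
beta = 1.39
odds_ratio = math.exp(beta)
print(round(odds_ratio, 2))  # 4.01, i.e. roughly a 4-fold increase in odds
```

An odds ratio of 4 means the odds of adoption (not the probability) are quadrupled relative to the single-event baseline.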
This study determines which factors are associated with the use of rotational grazing and the frequency with which Tennessee producers rotate cattle during the summer months. Survey data were used to estimate an ordered response model with sample selection. Most respondents used rotational grazing, and the most frequent rotational schedule was rotating cattle one to two times per month. Factors including labor, capital, knowledge, and water availability influenced the use of rotational grazing and the frequency of rotating cattle. The insights from this study can inform the development of incentives to promote more intensive use of rotational grazing.
Fluting is a technological and morphological hallmark of some of the most iconic North American Paleoindian stone points. Through decades of detailed artifact analyses and replication experiments, archaeologists have spent considerable effort reconstructing how flute removals were achieved, and they have explored possible explanations of why fluting was such an important aspect of early point technologies. However, the end of fluting has been less thoroughly researched. In southern North America, fluting is recognized as a diagnostic characteristic of Clovis points dating to approximately 13,000 cal yr BP, the earliest widespread use of fluting. One thousand years later, fluting occurs more variably in Dalton and is no longer useful as a diagnostic indicator. How did fluting change, and why did point makers eventually abandon fluting? In this article, we use traditional 2D measurements, geometric morphometric (GM) analysis of 3D models, and 2D GM of flute cross sections to compare Clovis and Dalton point flute and basal morphologies. The significant differences observed show that fluting in Clovis was highly standardized, suggesting that fluting may have functioned to improve projectile durability. Because Dalton points were used increasingly as knives and other types of tools, maximizing projectile functionality became less important. We propose that fluting in Dalton is a vestigial technological trait retained beyond its original functional usefulness.
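Geometric morphometric comparisons like those described typically begin with Procrustes superimposition, which removes position, scale, and orientation before shapes are compared; a minimal 2D sketch with made-up landmark coordinates (not the study's data):

```python
import numpy as np
from scipy.spatial import procrustes

# Two hypothetical 2D landmark configurations (e.g. flute cross-section outlines),
# the second being a rotated, scaled, translated copy of the first
a = np.array([[0, 0], [1, 0], [1, 2], [0, 2], [0.5, 1]], dtype=float)
rot = np.array([[np.cos(0.3), -np.sin(0.3)],
                [np.sin(0.3),  np.cos(0.3)]])
b = (a @ rot.T) * 2.5 + np.array([10.0, -4.0])

# Procrustes removes translation, scale, and rotation, then reports the
# residual shape difference (disparity)
m1, m2, disparity = procrustes(a, b)
print(disparity)  # ~0: the shapes are identical up to a similarity transform
```

Nonzero disparities across many specimens are what analyses like the Clovis-versus-Dalton comparison then test for standardization and group differences.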
Arctic rabies virus variant (ARVV) is enzootic in Quebec (Canada) north of the 55th parallel. With climate change, the increased risk of re-incursion of ARVV into more densely populated southern regions raises public and animal health concerns. The objective of this study was to prioritise geographical areas to target for early detection of ARVV incursion south of the 55th parallel, based on the historical spatio-temporal trends of reported rabies in foxes in Quebec. Descriptive analyses of fox rabies cases from 1953 to 2017 were conducted. Three periods show increases in the number of fox rabies cases in southern regions and indicate incursion from northern areas or neighbouring provinces. The available data, particularly in central and northern regions of the province, were scarce and of low spatial resolution, making it impossible to identify the path of spread with precision. Hence, we investigated the use of multiple criteria, such as historical rabies cases, human population density and red fox (Vulpes vulpes) relative abundance, to prioritise areas for enhanced surveillance. This study underscores the need to define and maintain new criteria for selecting samples to be analysed in order to rapidly detect ARVV cases outside the current enzootic area and any potential re-incursion of the virus into central and southern regions of the province.
This article emerged as the human species collectively have been experiencing the worst global pandemic in a century. With a long view of the ecological, economic, social, and political factors that promote the emergence and spread of infectious disease, archaeologists are well positioned to examine the antecedents of the present crisis. In this article, we bring together a variety of perspectives on the issues surrounding the emergence, spread, and effects of disease in both the Americas and Afro-Eurasian contexts. Recognizing that human populations most severely impacted by COVID-19 are typically descendants of marginalized groups, we investigate pre- and postcontact disease vectors among Indigenous and Black communities in North America, outlining the systemic impacts of diseases and the conditions that exacerbate their spread. We look at how material culture both reflects and changes as a result of social transformations brought about by disease, the insights that paleopathology provides about the ancient human condition, and the impacts of ancient globalization on the spread of disease worldwide. By understanding the differential effects of past epidemics on diverse communities and contributing to more equitable sociopolitical agendas, archaeology can play a key role in helping to pursue a more just future.
To assess the prevalence and correlates of sexual and/or physical abuse (SPA) during childhood and adolescence in bipolar I disorder (BD) patients treated for a first episode of psychotic mania.
Methods
The Early Psychosis Prevention and Intervention Centre (EPPIC) admitted 786 first-episode psychosis (FEP) patients between 1998 and 2000. Data were collected from patients’ files using a standardized questionnaire. Of the 704 files available, 43 were excluded because of a non-psychotic diagnosis at endpoint and 3 because of missing data regarding past stressful events. Among the 658 patients with available data, 118 received a final diagnosis of BD and were entered in this study.
Results
80% of patients had been exposed to stressful life events during childhood and adolescence, and 24.9% to SPA; in particular, 29.8% of female patients had been exposed to sexual abuse. Patients who were exposed to SPA had poorer pre-morbid functioning and higher rates of forensic history, were less likely to live with family during the treatment period, and were more likely to disengage from treatment.
Conclusions
Sexual and/or physical abuse is highly prevalent in BD patients presenting with a first episode of psychotic mania; exposed patients have lower pre-morbid functional levels and poorer engagement with treatment. The context in which such traumas occur must be explored in order to determine whether early intervention strategies may help diminish their prevalence. Specific psychological interventions must also be developed.
There is widespread evidence that schizophrenic symptomatology is best represented by three syndromes (positive, negative, disorganized). Both the disorganized and negative syndrome have been found to correlate with several neurocognitive dysfunctions. However, previous studies investigated samples predominantly treated with typical neuroleptics, which frequently induce parkinsonian symptoms that are hard to disentangle from primary negative symptoms and may have inflated correlations with neurocognition. A newly developed psychopathological instrument called the Positive and Negative and Disorganized Symptoms Scale (PANADSS) was evaluated in 60 schizophrenic patients. Forty-seven participants treated with atypical neuroleptics performed several neurocognitive tasks.
A three-factor solution of schizophrenic symptomatology emerged. Negative symptomatology was associated with diminished creative verbal fluency and digit span backward, whereas disorganization was significantly correlated with impaired Stroop, WCST and Trail-Making Test B performance.
Data suggest that disorganization is associated with tasks that demand executive functioning. Previous findings reporting correlations between negative symptomatology and neurocognition may have been confounded by the adverse consequences of typical neuroleptics.
The aims of this study were to (1) determine which antipsychotic side effects (SE) patients with schizophrenia consider the most distressing during treatment with typical antipsychotics, (2) measure the impact of present and past SE on patients' attitude toward antipsychotics and (3) assess the influence of both on adherence.
Methods
A total of 213 patients with schizophrenia treated with conventional antipsychotics were recruited in two psychiatric hospitals in Hamburg. Subjects were assessed regarding the type and severity of present and past side effects and their attitude toward and adherence to antipsychotic treatment.
Results
Eighty-two patients (39%) presented with present SE, while 131 (61%) did not. Sexual dysfunctions (P<0.001), extrapyramidal (P<0.05) and psychic side effects (P<0.05) were rated as subjectively significantly more distressing than sedation or vegetative side effects. Patients with present SE, compared with patients without, had a significantly more negative general attitude toward antipsychotics (P<0.05), were more doubtful about their efficacy (P<0.01) and were less likely to encourage a relative to take such a medication in case of need (P<0.001). A regression analysis indicated that nonadherence was mainly influenced by negative general and efficacy attitudes toward antipsychotics and the experience of past or present antipsychotic side effects.
Conclusions
All antipsychotic side effects, present or past, can have a durable negative impact on patients' attitudes toward antipsychotic treatment and adherence. Non-adherence is mainly determined, among other factors, by these negative attitudes, which are partly influenced by the experience of past or present antipsychotic-induced side effects.
Cognitive impairments in patients with schizophrenia have been found to precede tardive dyskinesia and to co-exist with other motor deficits. However, little is yet known about the prevalence of cognitive disturbances in patients with neuroleptic-induced parkinsonism. From the literature on idiopathic Parkinson's disease, it was inferred that extrapyramidal symptoms (EPS) are accompanied by cognitive dysfunction. Eighty-five schizophrenic in-patients were divided into EPS high and low scorers according to an established criterion (Simpson Angus Scale, cut-off score: 0.4). Cognitive impairments were assessed using a self-rating instrument measuring disturbances of information processing.
Patients with high EPS exhibited significantly elevated scores on six of ten cognitive and perceptual subscales (t = 2.1–3.1) compared to low EPS patients. It is concluded that high EPS patients suffer from cognitive disturbances which are assumed to possess high relevance for both psycho-social and medical treatment. Cognitive problems may, when not considered, disturb compliance, insight into illness and transfer of learnt skills into everyday life.
To measure symptomatic and functional remission in patients treated with risperidone long-acting injectable (RLAI).
Methods
Stable patients with psychotic disorders requiring medication change were switched to open-label RLAI in the switch to risperidone microspheres (StoRMi) trial. In this post-hoc analysis of the trial extension, follow-up was ≤18 months. Symptomatic remission was based on improvement in positive and negative syndrome scale (PANSS) scores and global remission (best outcome) was based on symptomatic remission, functional level, and mental-health quality of life. Predictive factors were evaluated.
Results
Among 529 patients from seven European countries, mean participation duration was 358.7 ± 232.4 days, with 18 months completed by 39.9% of patients. Symptomatic remission lasting ≥6 months occurred at some point during treatment in 33% of patients; predictors included comorbid disease, country, baseline symptom severity, baseline functioning, type of antipsychotic before switching, and duration of untreated psychosis. Best outcome occurred in 21% of patients; predictors included baseline symptom severity, baseline functioning, country, schizophrenia type, and early positive treatment course.
Conclusions
One in three patients with stable schizophrenia switching to RLAI experienced symptomatic remission, with combined symptomatic, functional, and quality-of-life remission in one in five patients. Symptomatic remission was predicted by lower baseline symptom severity and country of origin, with a significantly greater likelihood of remission among patients in Estonia/Slovenia compared with Portugal. Relapse was predicted by higher mode doses of RLAI, additional use of psychoactive medications, male gender, and country of origin, with relapse occurring most frequently in France and least frequently in Portugal. RLAI dose, additional use of psychoactive medications, and country of origin predicted best outcome, with best outcome occurring most frequently in Estonia/Slovenia and least frequently in Portugal.
Studies indicate that patient-rated outcomes and symptomatic remission as defined by the remission in schizophrenia working group rely on different assumptions. The aim of this observational study was to assess symptomatic remission by patients with schizophrenia, family members and psychiatrists and to compare their assessments with standardized criteria and clinical measures.
Methods
One hundred and thirty-one patients with schizophrenia (DSM-IV), family members and psychiatrists assessed remission within the European Group on Functional Outcomes and Remission in Schizophrenia (EGOFORS) project. Symptoms (Positive and Negative Syndrome Scale [PANSS]), functional outcome (Functional Recovery Scale in Schizophrenia [FROGS]), subjective well-being (SWN-K) and demographic characteristics were investigated.
Results
Remission assessed by psychiatrists showed the best accordance with standardized remission (80%), followed by remission assessed by family members (52%) and patients (43%). In only 18% of cases did patients, relatives and psychiatrists agree in their assessments. Good subjective well-being was most important for remission as estimated by patients; good subjective well-being and symptom reduction for family members; and better symptom scores, well-being and functioning for psychiatrists.
Discussion
Self- and expert-rated clinical outcomes differ markedly, with a preference on the patients’ side for subjective outcome. Symptomatic remission as assessed by the standardized criteria plays a secondary role for patients and relatives in daily clinical practice. A more thorough consideration of patients’ and caregivers’ perspectives should supplement the experts’ assessment.
Studies reported close associations between functional outcome and symptomatic remission as defined by the Remission in Schizophrenia Working Group. This observational study was aimed at the investigation of deficits in daily functioning, symptoms and subjective well-being in remitted and non-remitted patients with schizophrenia.
Methods
Symptoms (PANSS), functional outcome (FROGS, GAF), subjective well-being (SWN-K) and other characteristics were assessed in 131 patients with schizophrenia (DSM-IV) within the European Group on Functional Outcomes and Remission in Schizophrenia (EGOFORS) project.
Results
A significantly better level of functioning was measured for remitted versus non-remitted patients, though remitted patients still showed areas with an inadequate level of functioning. Functional deficits were most often seen in social relations (40%), work (29%) and daily life activities (17%). Best functioning was assessed for self-care, self-control, health management and medical treatment. A moderate to severe level of disorganization and emotional distress was observed in 38%, and impaired subjective well-being in 29%, of patients defined as being in symptomatic remission.
Discussion
The results confirm a close association between symptomatic remission and functional outcome. However, deficits in different areas of functioning, symptoms and well-being underline the need for combined outcome criteria for patients with schizophrenia.
To review the management of temporal bone fractures at a major trauma centre and introduce an evidence-based protocol.
Methods
A review of reports of head computed tomography performed for trauma from January 2012 to July 2018 was conducted. Recorded data fields included: mode of trauma, patient age, associated intracranial injury, mortality, temporal bone fracture pattern, symptoms and intervention.
Results
Of 815 temporal bone fracture cases, records for 165 patients met the inclusion criteria; detailed analysis was performed on these records.
Conclusion
Temporal bone fractures represent high-energy trauma. Initial management focuses on stabilisation of the patient and treatment of associated intracranial injury. Acute ENT intervention is directed towards the management of facial palsy and cerebrospinal fluid leak, and often requires multidisciplinary team input. The role of nerve conduction assessment for immediate facial palsy is variable across the UK. The administration of high-dose steroids in patients with temporal bone fracture and intracranial injury is not advised. A robust evidence-based approach is introduced for the management of significant ENT complications associated with temporal bone fractures.
Research on producer willingness to adopt individual best pasture management practices (BMPs) is extensive, but less attention has been paid to producers simultaneously adopting multiple, complementary BMPs. Applications linking primary survey data on BMP adoption to water quality biophysical models are also limited. A choice-experiment survey of livestock producers is analyzed to determine willingness to adopt pasture BMPs. Sediment abatement curves are derived by linking estimates of producer responsiveness to incentives to adopt rotational grazing with a biophysical simulation model. Current cost share rates of $24/acre should yield a 12% decrease in sediment loading from pastures.
We determined how pasture and grazing management practices affected the number of days hay was fed to cattle by season. Data were collected from a survey of Tennessee cattle producers. Days of cattle on hay varied across seasons because of variations in forage production and weather. The number of days hay was fed to cattle varied with pasture-animal management practices such as rotating pastures, forage mixtures, and weed management strategies. Having mixtures of cool- and warm-season grasses reduced the number of days on hay in the winter, spring, and summer months indicating benefits from diversified forages.
Introduction: Emergency Department (ED) consultations are often necessary for safe and effective patient care. Delays in throughput related to ED consultations can increase a patient's ED length of stay (LOS) and contribute to ED crowding. This review aimed to characterize and evaluate interventions to improve consultation metrics. Methods: Eight primary literature databases and the grey literature were comprehensively searched. Comparative studies of interventions to improve ED consultation metrics were included. Unique citations were screened for relevance, and the full texts of relevant articles were reviewed by two independent reviewers. Data on study characteristics and outcomes were extracted in duplicate onto standardized forms. Disagreements were resolved through consensus. Categorical variables are reported as proportions. Continuous variables are reported as the median of the means and total range. Results: After screening 2632 unique citations and 19 grey literature items, 24 studies were included. Seventeen interventions targeted specific conditions or specialty services, while the remainder targeted all ED presentations. Interventions fell into three broad categories: strategies to expedite patient care, including clinical pathways (42%); interventions to improve consultant responsiveness (33%); and addition of a specialized care team to the ED (25%). Overall, eight studies reported on the overall proportion of consults in the ED, of which six reported an increase in the consultation proportion (median: +0.6%, range: −11.3% to +49.6%). Six studies reported the proportion of consulted patients who were admitted, of which four reported an increase (median: +1.1%, range: −5.9% to +3.5%). On the other hand, six of seven studies reporting on time from request to consult arrival reported a decrease (median: −25 minutes, range: −66 to +3.8 minutes).
Similarly, overall ED LOS was reported to be lower in 17/19 studies reporting this metric (median: −47.6 minutes, range: −600 minutes to +59 minutes). Conclusion: A variety of strategies have been employed to improve ED consultation processes and outcomes. Neither the proportion of consulted patients in the ED nor the proportion of admissions was improved; however, interventions appeared successful at improving consultant arrival times and overall ED LOS. Improvements in consultation processes may be an effective strategy to improve ED throughput and thereby reduce ED crowding.
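The review's summary convention for continuous outcomes ("median of the means and total range") is straightforward to compute; the per-study values below are illustrative only, not the review's data:

```python
import statistics

# Hypothetical per-study mean changes in ED LOS (minutes); negative = improvement
study_means = [-600, -120, -47.6, -35, -20, 59]

# Summarize across studies: median of the reported means, plus the total range
median_of_means = statistics.median(study_means)
total_range = (min(study_means), max(study_means))

print(median_of_means)  # median across the studies' reported means
print(total_range)      # (-600, 59)
```

Taking the median of study-level means avoids letting a single extreme study (such as a −600 minute outlier) dominate the pooled summary.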