Objectives/Goals: To facilitate engagement between university researchers and Appalachian Kentucky communities, the UK Rural Research Hub (RRH) promotes Community Engaged Research (CEnR) and academic–community partnerships that have the greatest potential to conduct impactful research to improve health and reduce regional health disparities. Methods/Study Population: Through the UK RRH, a wealth of expertise and a range of services sustain successful CEnR. Hub coordinators provide research consultations, accelerate researchers’ engagement with the community, and facilitate the success of studies through study coordination, assistance with participant recruitment, data collection and interventions, and through dissemination back to the community. Results/Anticipated Results: UK RRH coordinators have supported numerous studies across the region. For example, RRH staff facilitated recruitment of and collected data from 40 Appalachian caregivers of patients with Alzheimer’s disease and related dementias (ADRD) in a study to improve home environments for patient well-being. The study provided pilot data for a successful K23 application. Other examples of supported research include studies to improve cancer screening uptake, self-management of diabetes, and cardiovascular disease risk reduction, resulting in improved care in the community and often providing pilot data leading to larger national grants. Discussion/Significance of Impact: Research addressing the complex health issues that burden Appalachian Kentucky requires community engagement to be successful. The UK RRH is at the heart of successful CEnR that benefits researchers and communities alike.
Objectives: Activities that require active thinking, such as occupations, may influence cognitive function and its change over time. Associations between retirement and dementia risk have been reported; however, the role of retirement age in these associations is unclear. We assessed associations of occupation and retirement age with cognitive decline in the US community-based Atherosclerosis Risk in Communities (ARIC) cohort.
Methods: We included 14,090 ARIC participants, followed for changes in cognition for up to 21 years. Information on current or most recent occupation was collected at ARIC baseline (1987–1989; participants aged 45–64 years) and categorized according to the 1980 US Census protocols and the Nam-Powers-Boyd occupational status score. Follow-up data on retirement were collected during 1999–2007 and classified as retired versus not retired at age 70. Trajectories of global cognitive factor scores from ARIC visit 2 (1990–1992) to visit 5 (2011–2013) were estimated, and associations with occupation and age at retirement were studied using generalized estimating equation models, stratified by race and sex, and adjusted for demographics and comorbidities.
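As a rough illustration of the model described above, the sketch below fits a generalized estimating equation to repeated cognitive scores, assuming an exchangeable within-person correlation. All column names (participant_id, factor_score, years_since_v2, occ_status, education) are illustrative stand-ins, not the ARIC data dictionary.

```python
# Hedged sketch of a GEE for repeated cognitive factor scores; the
# time-by-occupation interaction captures differential 21-year decline.
import statsmodels.api as sm
import statsmodels.formula.api as smf

def fit_cognition_gee(df):
    model = smf.gee(
        "factor_score ~ years_since_v2 * C(occ_status) + age + C(education)",
        groups="participant_id",                  # repeated measures per person
        data=df,
        cov_struct=sm.cov_struct.Exchangeable(),  # within-person correlation
    )
    return model.fit()

# Stratified fits, one per race-sex stratum, as described in the abstract:
# for (race, sex), sub in df.groupby(["race", "sex"]):
#     print(race, sex, fit_cognition_gee(sub).summary())
```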
Results: Mean age (SD) at first cognitive assessment was 57.0 (5.72) years. Higher occupational status and white-collar occupations were significantly associated with higher cognitive function at baseline. Occupation was associated with cognitive decline over 21 years only in women, and the direction of the effect on cognitive function differed between black and white women: in white women, the decline in cognitive function was greater in homemakers and low-status occupations, whereas in black women, less decline was found in homemakers and low (compared to high) occupational status. Interestingly, retirement on or before age 70 was associated with less 21-year cognitive decline in all race-sex strata, except for black women.
Conclusions: Associations between occupation, retirement age and cognitive function substantially differed by race and sex. Further research should explore reasons for the observed associations and race-sex differences.
Knowledge of Ascophyllum nodosum extracts (ANEs) is still limited for ‘Hass’ avocado in the tropics. The objective of this study was to evaluate the effects of two ANE application methods (foliar v. drench) at four different doses (0, 2.5, 5 and 7.5 ml/l) on the physiological response of avocado at three different stages (seedlings, young trees and adult trees). Foliar or drench ANE applications were performed monthly on all plants for 16 weeks. The evaluated variables were recorded at 4 and 20 weeks after the start of treatment (WAT). The results showed that ANEs can be applied by either the drench or foliar method at doses ≥5 ml/l in the different growth stages evaluated. In seedlings, foliar or drench ANE applications increased total dry weight (34.5 and 57.9 g for 0 and ≥5 ml/l, respectively) and affected stomatal conductance (gs) (380 and 205 mmol/m²/s for 0 and ≥5 ml/l, respectively) at 20 WAT. In young trees, both application methods also improved the growing index (88.6 and 102 cm for 0 and ≥5 ml/l, respectively) and gs (516 and 636 mmol/m²/s for 0 and ≥5 ml/l, respectively) at the last sampling point. In adult trees, foliar or drench applications at higher doses also increased fruit yield (3.4 and 8.7 kg/tree for 0 and ≥5 ml/l, respectively) at 20 WAT. In conclusion, foliar and soil ANE applications at higher doses (≥5 ml/l) can be considered for integrated crop management of ‘Hass’ avocado.
Background: Stereotactic laser amygdalohippocampotomy (SLAH) has recently been shown to be comparable to traditional temporal lobectomy procedures. The ideal extent and volume of laser ablations remains an area of investigation. Methods: 65 patients treated with SLAH for mesial temporal sclerosis (MTS) were considered in this retrospective study. Manual segmentations of ablations were created using post-procedure T1-MRI scans. Ablations were assessed in relation to whether they crossed the coronal plane of the superior lateral mesencephalic sulcus (LMS), the extent to which ablation crossed this landmark, and the extent of ablation of the uncus. Analysis was done with a binary categorization of the 12-month Engel classification score. Results: Distance of ablation posterior to the coronal plane of the LMS was not associated with better surgical outcome (Engel class 1: 6.32 ± 4.16 mm; Engel class 2-4: 7.93 ± 3.75 mm; p = 0.099). The ratio of ablations extending posterior to the LMS was 0.82 (SD = 0.39) in Engel 1 patients and 0.90 (SD = 0.3) in Engel 2-4 patients (p = 0.370). Volume of ablation showed little correlation with outcome (Engel class 1: 6064 ± 2128 mm³; Engel class 2-4: 5828 ± 3031 mm³; p = 0.239). Ablation of the uncus showed a strong association with better surgical outcome (Engel class 1: 0.71 (SD = 0.31); Engel 2-4: 0.37 (SD = 0.36); p < 0.001). Conclusions: Contrary to current practice, extension of ablation posterior to the LMS did not demonstrate improved outcome.
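For readers who want to reproduce the flavour of these group comparisons, the sketch below contrasts a continuous ablation metric between Engel class 1 and Engel class 2-4 patients. The abstract does not state which two-sample test was used, so Welch's t-test and the data values here are assumptions.

```python
# Hypothetical uncus-ablation ratios for the two outcome groups.
import numpy as np
from scipy import stats

engel1 = np.array([0.9, 0.7, 0.8, 0.4, 1.0, 0.6])
engel24 = np.array([0.2, 0.5, 0.1, 0.6, 0.4])

# Welch's t-test (no equal-variance assumption) on the ablation ratios.
t, p = stats.ttest_ind(engel1, engel24, equal_var=False)
print(f"uncus ablation, Engel 1 vs 2-4: t = {t:.2f}, p = {p:.3f}")
```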
Dentists prescribe 10% of all outpatient antibiotics in the United States and are the top specialty prescriber. Data on current antibiotic prescribing trends are scarce. Therefore, we evaluated trends in antibiotic prescribing rates by dentists and further assessed whether these trends differed by agent, specialty, and patient characteristics.
Design:
Retrospective study of dental antibiotic prescribing included data from the IQVIA Longitudinal Prescription Data set from January 1, 2012 to December 31, 2019.
Methods:
The change in the dentist prescribing rate and mean days’ supply were evaluated using linear regression models.
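A minimal sketch of this trend analysis is given below, with made-up yearly rates standing in for the proprietary IQVIA data.

```python
# Regress annual prescribing rate on calendar year; the slope estimates the
# yearly change and its P value tests for a trend. Rates are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

trend = pd.DataFrame({
    "year": range(2012, 2020),
    "rx_rate": [84_000, 84_500, 83_900, 84_200,
                84_100, 84_300, 84_000, 84_400],  # per 1,000 dentists
})
fit = smf.ols("rx_rate ~ year", data=trend).fit()
print(fit.params["year"], fit.pvalues["year"])    # slope, trend P value
```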
Results:
Dentists wrote >216 million antibiotic prescriptions between 2012 and 2019. The annual dental antibiotic prescribing rate remained steady over time (P = .5915). However, the dental prescribing rate (antibiotic prescriptions per 1,000 dentists) increased in the Northeast (by 1,313 antibiotics per 1,000 dentists per year), among oral and maxillofacial surgeons (n = 13,054), prosthodontists (n = 2,381), endodontists (n = 2,255), periodontists (n = 1,961), and for amoxicillin (n = 2,562; P < .04 for all). The mean days’ supply significantly decreased over the study period by 0.023 days per 1,000 dentists per year (P < .001).
Conclusions:
From 2012 to 2019, dental prescribing rates for antibiotics remained unchanged, despite decreases in antibiotic prescribing nationally and changes in guidelines during the study period. However, mean days’ supply decreased over time. Dental specialists, such as oral and maxillofacial surgeons, had the highest prescribing rates, with increases over time. Antibiotic stewardship efforts to reduce unnecessary prescribing by dentists, particularly efforts targeting dental specialists, may decrease overall antibiotic prescribing rates by dentists.
To determine whether a structured outpatient parenteral antimicrobial therapy (OPAT) program supervised by an infectious disease physician and led by an OPAT nurse decreased hospital readmission rates and OPAT-related complications, and whether it affected clinical cure. We also evaluated predictors of readmission while receiving OPAT.
Patients:
A convenience sample of 428 patients admitted to a tertiary-care hospital in Chicago, Illinois, with infections requiring intravenous antibiotic therapy after hospital discharge.
Methods:
In this retrospective, quasi-experimental study, we compared patients discharged on intravenous antimicrobials from an OPAT program before and after implementation of a structured ID physician and nurse-led OPAT program. The preintervention group consisted of patients discharged on OPAT managed by individual physicians without central program oversight or nurse care coordination. All-cause and OPAT-related readmissions were compared using the χ2 test. Factors associated with readmission for OPAT-related problems at a significance level of P < .10 in univariate analysis were eligible for testing in a forward, stepwise, multinomial, logistic regression to identify independent predictors of readmission.
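The two named analyses might look like the following sketch, under assumed variable names; statsmodels has no built-in stepwise routine, so the selection step is only outlined in a comment.

```python
# Chi-square comparison of readmission proportions before vs after the
# structured OPAT program; counts are back-calculated to roughly match the
# abstract's 17.8% vs 7% and are therefore approximate.
import numpy as np
from scipy.stats import chi2_contingency
import statsmodels.api as sm

table = np.array([[39, 180],    # pre:  39/219 ≈ 17.8% readmitted
                  [15, 194]])   # post: 15/209 ≈ 7% readmitted
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")

# Forward stepwise multinomial logit (outline): start from an empty model,
# add the univariate predictor (P < .10) that most improves fit, repeat.
# result = sm.MNLogit(y, sm.add_constant(X_selected)).fit()
```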
Results:
In total, 428 patients were included in the study. Unplanned OPAT-related hospital readmissions decreased significantly after implementation of the structured OPAT program (17.8% vs 7%; P = .003). OPAT-related readmission reasons included infection recurrence or progression (53%), adverse drug reaction (26%), or line-associated issues (21%). Independent predictors of hospital readmission due to OPAT-related events included vancomycin administration and longer length of outpatient therapy. Clinical cure increased from 69.8% before the intervention to 94.9% after the intervention (P < .001).
Conclusion:
A structured ID physician and nurse-led OPAT program was associated with a decrease in OPAT-related readmissions and improved clinical cure.
The aim of this study was to assess the feasibility and test-retest reliability of the Welfare Quality® Animal Welfare Assessment Protocol for Growing Pigs. Twenty-three German pig farms were visited repeatedly by the same trained observers, each farm being visited six times over two fattening periods. The entire protocol was carried out during each farm visit, ie a Qualitative Behaviour Assessment (QBA), behavioural observations (BO), a Human-Animal Relationship test (HAR) and different individual parameters (IPs), eg bursitis and tail-biting. Test-retest reliability was evaluated by a Wilcoxon signed rank test (W) and by calculation of the Smallest Detectable Change (SDC) and Limits of Agreement (LoA). The QBA showed unsatisfactory agreement between farm visits. In contrast, agreement was generally good for the BO. For the HAR, no reliability could be detected. Most IPs showed acceptable agreement, with the exception of bursitis and manure on the body. Bursitis showed large differences between visits, which can be explained by difficulties in assessment when the animals moved around or their legs were dirty. The disagreement in the parameter manure on the body can be explained by seasonal effects. Disagreement was also found for the parameters coughing, sneezing, pleuritis, pneumonia and milk spots. Feasibility was good: both observers could be trained to carry out the protocol, and the time needed for an assessment did not exceed 6 h. The parts of the protocol that proved insufficiently reliable need to be addressed in future in order to enhance and improve the objective measurement of animal welfare.
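For reference, the three reliability statistics named here have standard formulas; the sketch below computes them from paired visit scores (the scores are hypothetical).

```python
# Test-retest reliability on paired farm-level scores from two visits.
import numpy as np
from scipy.stats import wilcoxon

visit1 = np.array([12.0, 8.5, 15.2, 9.1, 11.3, 7.8])
visit2 = np.array([11.4, 9.0, 14.8, 10.2, 11.0, 8.5])

stat, p = wilcoxon(visit1, visit2)   # W test for a systematic shift

diff = visit2 - visit1
sd_diff = diff.std(ddof=1)
sdc = 1.96 * sd_diff                 # SDC = 1.96 * sqrt(2) * SEM, with
                                     # SEM = sd_diff / sqrt(2)
loa = (diff.mean() - 1.96 * sd_diff, # Bland-Altman 95% limits of agreement
       diff.mean() + 1.96 * sd_diff)
```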
Loneliness, a negative emotion stemming from the perception of unmet social needs, is a major public health concern. Current interventions often target social domains but produce small effects and are not as effective as established emotion regulation (ER)-based interventions for general psychological distress (i.e., depression/anxiety). Given that loneliness and distress are types of negative affect, we aimed to compare them within an ER framework by examining the amount of variance ER strategies accounted for in loneliness versus distress, and comparing the ER strategy profiles characterising them. Participants (N = 582, mean age = 22.31 years, 77.66% female) completed self-report measures of loneliness, distress, and use of 12 cognitive (e.g., cognitive reappraisal) or behavioural (e.g., expressive suppression) ER strategies. Regression analyses revealed that ER explained comparable variance in these constructs. Latent profile analysis identified seven profiles differing in ER patterns, with no distinct loneliness or distress profile identified. Rather, similar patterns of ER characterised these two constructs, involving the greater use of generally maladaptive strategies and the lesser use of generally adaptive strategies. However, loneliness was additionally characterised by less use of strategies involving social connection/expression. Overall, our study supports the utility of ER for understanding loneliness. Established ER-based frameworks/interventions for distress may have transdiagnostic utility in targeting loneliness.
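Latent profile analysis of this kind is typically run in Mplus or R; a rough Python analogue fits Gaussian mixture models over the 12 ER-strategy scores and picks the number of profiles by BIC. All names here are illustrative, and this is only an approximation of LPA.

```python
# Approximate LPA with Gaussian mixtures; er_scores is an
# (n_participants, 12) array of ER strategy-use scores.
from sklearn.mixture import GaussianMixture

def pick_profiles(er_scores, max_k=10, seed=0):
    fits = [GaussianMixture(n_components=k, random_state=seed).fit(er_scores)
            for k in range(1, max_k + 1)]
    best = min(fits, key=lambda m: m.bic(er_scores))   # lowest BIC wins
    return best, best.predict(er_scores)               # profile assignments
```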
To identify characteristics of US health systems and end users that report antimicrobial use and resistance (AUR) data to the National Healthcare Safety Network (NHSN), to determine how NHSN AUR data are used by hospitals, health systems, and end users, and to identify barriers to AUR reporting.
Design:
An anonymous survey was sent to Society of Infectious Diseases Pharmacists (SIDP) and Society for Healthcare Epidemiology of America (SHEA) Research Network members.
Methods:
Data were collected via Survey Monkey from January 21 to February 21, 2020. Respondent and hospital data were analyzed using descriptive statistics.
Results:
We received responses from 238 individuals across 43 US states. Respondents were primarily pharmacists (84%), from urban areas (44%), from nonprofit medical centers (81%), and from hospitals with >250 beds (72%). Also, 62% reported data to the AU module and 19% reported data to the AR module. Use of software for local AU or AR tracking was associated with increased reporting to the AU module (19% vs 64%) and the AR module (2% vs 30%) (P < .001 each). Only 36% of those reporting data to the AU module used NHSN AUR data analysis tools regularly, and only 9% of those reporting data to the AR module did so. Technical challenges and time and/or salary support were the most common barriers to AUR participation cited by all respondents. Among those not reporting AUR data, increased local expectations to report and better software solutions were the most commonly identified solutions to increase AUR reporting.
Conclusions:
Efforts to increase AUR reporting should focus on software solutions and salary support for data-entry activities. Increasing expectations to report may incentivize local resource allocation to improve AUR reporting rates.
The aim of the study was to analyse the influence of crude fibre in piglets’ rations on tail-biting in undocked pigs during the rearing period. All pigs were fed the same pre-starter until weaning. The study comprised two trials with four experimental groups each. The first trial contained a control group (CG1) with conventional feed (up to 40 g/kg crude fibre), two groups with an increased crude fibre content of up to 50 g/kg (G5) and 60 g/kg (G6), respectively, and one group with conventional feed and ad libitum crude fibre provision (AL). The second trial consisted of a control group (CG2), which received the same conventional feed as CG1, and three treatment groups with either soya hulls (SS), dried sugar beet pulp (DP) or oat fibre (OF) admixed to their ration to achieve a crude fibre content of 60 g/kg in all three groups. The rearing week, the batch, the treatment group (only in trial one) and the interaction between batch and treatment group had a significant influence on tail lesions (P < 0.05). The tail-biting process started in rearing week 3 (trial one) and week 5 (trial two), respectively. Given the low frequency of tail-biting during the present study, crude fibre seems to have no major influence on tail-biting during the rearing period. This unexpected result may be explained by the optimized conditions in which the piglets were kept and the intensive animal observation carried out by the employees. However, the batch effect was the most influential.
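A sketch of the fixed-effects structure described above (rearing week, batch, treatment group, and the batch-by-group interaction) is shown below; the abstract does not specify the estimation framework, so an ordinary linear model on an assumed lesion score stands in.

```python
# Linear model for tail-lesion scores with the terms named in the abstract.
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

def fit_tail_lesions(df):
    fit = smf.ols(
        "lesion_score ~ C(rearing_week) + C(batch) * C(treatment_group)",
        data=df,
    ).fit()
    return anova_lm(fit)   # P values for week, batch, group, interaction
```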
People with mental illness are at high risk of suicide. About 5% of these suicides occur during psychiatric inpatient treatment. Few data are available on demographics and risk factors for this population. Therefore, we analysed all psychiatric inpatient suicides from 1992–2004 in a catchment area of about 1.2 million inhabitants in Switzerland.
Methods
Chart review.
Results
We identified 142 patients who committed suicide while in hospital, of whom 125 charts could be reviewed. 52% were male. 52% were diagnosed with an affective disorder and 26% with a psychotic disorder. 59% were admitted due to suicidal ideation. 58% had a history of suicide attempt(s). 74% reported serious life events prior to the index hospitalisation. 74% committed suicide outside the hospital. Most suicides occurred in months 3–6 after admission. In the last assessment before the suicide, 88% had affective symptoms, 66% anxiety, 63% hopelessness, 42% psychotic symptoms and 36% agitation/restlessness. Of those with affective symptoms, 79% received antidepressive medication; of those with psychotic symptoms, 77% received antipsychotics; and of those with anxiety, 42% received anxiolytics. In their last interview before committing suicide, 64% denied suicidal ideation; 42% had a “non-suicide agreement” with their clinicians. According to the last clinical assessment, 80% of those who committed suicide were judged to be at low or no suicide risk.
Conclusions
Most inpatient suicides occurred unexpectedly. More rigorous treatment of anxiety, but also of affective and psychotic symptoms, could help to decrease suicides in inpatient settings. “Non-suicide agreements” could not prevent suicides.
To address and better understand the problem of high suicide rates in widows and widowers.
Methods:
Sex- and age-specific suicide data collated by marital group were extracted from Swiss mortality statistics for the period 1991–2003. Mortality in the first week, month and year of widowhood was calculated based on person-year calculations.
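A worked toy example of the person-year calculation is given below; the figures are hypothetical, not the Swiss data.

```python
# Rate per 100,000 person-years = events / (persons * years at risk) * 1e5.
def rate_per_100k(suicides: int, persons: int, years_at_risk: float) -> float:
    return suicides / (persons * years_at_risk) * 100_000

# A handful of suicides in the first week of widowhood annualizes to a
# far higher rate than the same cohort observed over a full year:
print(rate_per_100k(suicides=8, persons=50_000, years_at_risk=7 / 365))  # ~834
print(rate_per_100k(suicides=60, persons=50_000, years_at_risk=1.0))     # 120
```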
Results:
Cross-sectional analysis by sex and age-group confirms the existence of different suicide rate patterns according to marital status. Moreover, the profiles of suicide methods differ. In particular, suicide methods which may be associated with impulsive suicides, such as firearms or poisoning, are relatively frequent in the widowed. The suicide risk of widowed persons is extremely high in the days and weeks immediately after bereavement.
Conclusions:
Suicide risk and suicide behavior vary systematically according to marital status. In particular, widows and widowers emerge as a group suitable for preventive efforts because of the existence of a time window of increased risk. Moreover, widowed persons are a clear-cut risk group within the purview of undertakers, priests and perhaps general practitioners.
Patients’ expectancies have long been considered to contribute to treatment outcome. Whereas research has concentrated on different types of expectancies in predicting outcome, it has not examined their interactive contribution, therapist factors, or the development of expectancies over time. Therefore, the present study aims to investigate the independent as well as the interactive contributions of outcome expectancies (OE) and negative mood regulation expectancies (NMRE) to outcome. One hundred and forty depressed outpatients in cognitive-behavioral psychotherapy completed measures of OE and NMRE at pretreatment and midtreatment, as well as outcome measures at midtreatment and posttreatment. Patients’ OE were assessed using the Patients’ Therapy Expectation and Evaluation Questionnaire (PATHEV; Schulte, 2005), and NMRE using the short form of the Negative Mood Regulation Scale (NMR; Backenstrass et al., 2010). Outcome was measured using the German version of the Beck Depression Inventory – II (BDI-II; Hautzinger, Keller, & Kühner, 2006) and the Inventory of Depressive Symptomatology – Clinician Rated, 30-item version (IDS-C; Rush, Carmody, & Reimitz, 2000). We will perform a three-level longitudinal hierarchical analysis, with assessment time points as the first level, nested in patients (second level), who are nested in therapists (third level), controlling for comorbidities. We expect OE and NMRE to change significantly during therapy, and these changes to be related to outcome, both at midtreatment and posttreatment. We also expect to find a significant interaction between OE and NMRE in predicting outcome, as well as a significant influence of therapists on patients’ expectancies. Theoretical and clinical implications of the results will be discussed.
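One way the planned three-level structure could be approximated in Python is sketched below, using statsmodels' variance-components syntax; every column name is an assumption, and the study's actual model may differ.

```python
# Time points (level 1) nested in patients (level 2) nested in therapists
# (level 3): therapist is the grouping factor, patients enter as a
# variance component within therapists.
import statsmodels.formula.api as smf

def fit_expectancy_model(df):
    return smf.mixedlm(
        "bdi ~ time * oe * nmre + comorbidity",       # OE x NMRE interaction
        data=df,
        groups="therapist_id",                        # level 3
        re_formula="1",                               # therapist intercepts
        vc_formula={"patient": "0 + C(patient_id)"},  # level 2 within 3
    ).fit()
```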
Deficits of mismatch negativity (MMN) in schizophrenia and individuals at risk for psychosis have been replicated many times. Several studies have also demonstrated the occurrence of subclinical psychotic symptoms within the general population. However, none has yet investigated MMN in individuals from the general population who report subclinical psychotic symptoms.
Methods
The MMN to duration, frequency, and intensity deviants was recorded in 217 nonclinical individuals classified into a control group (n = 72) and three subclinical groups: paranoid (n = 44), psychotic (n = 51), and mixed paranoid-psychotic (n = 50). Amplitudes of MMN at frontocentral electrodes were referenced to the common average. Based on a three-source model of MMN generation, we conducted an MMN source analysis and compared the amplitudes at surface electrodes and sources among groups.
Results
We found no significant differences in MMN amplitudes of surface electrodes. However, significant differences in MMN generation among the four groups were revealed at the frontal source for duration-deviant stimuli (P = 0.01). We also detected a trend-level difference (P = 0.05) in MMN activity among those groups for frequency deviants at the frontal source.
Conclusions
Individuals from the general population who report psychotic symptoms are a heterogeneous group. However, alterations exist in their frontal MMN activity. This increased activity might be an indicator of more sensitive perception regarding changes in the environment for individuals with subclinical psychotic symptoms.
Faced with the effects of trauma, new psychotherapies are emerging in France, converging especially around awareness, experience and emotion. The hypothesis put forward here concerns the complementarity of the following two approaches: mindfulness, which belongs to a behavioural and cognitive framework, and EMDR, which draws on neuroscience through its alternating bilateral stimulations (ABS). The implementation of a protocol based on EMDR and mindfulness has shown convincing results in a demented elderly person suffering from complex PTSD. The protocol begins with a session devoted to anamnesis and symptom evaluation. The second phase consists of desensitization and cognitive restructuring; its principal foundations rely on EMDR but also include mindfulness exercises, both to reduce anxiety arising from the effects of therapy and to bring up new material when the process seems to reach a deadlock. The third phase is the consolidation of therapeutic benefits: ABS draw on the patient's resources, and meditation exercises are performed in order to amplify the restructuring. The combination of these two therapies could potentiate their respective effects. The single case study that we conducted showed encouraging results: reduction of symptoms of re-experiencing, autonomic hyper-activation and avoidance. Effects were also observed for comorbid symptoms, namely depression, anxiety and psychotic manifestations. The combination of these two approaches seems promising and requires replication.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
The effects of psychoactive substance abuse are not limited to the user, but extend to the entire family system, with children of substance abusers being particularly at risk. This meta-analysis attempted to quantify the longitudinal relationship between parental alcohol, tobacco, and drug use and child well-being, investigating variation across a range of substance and well-being indices and other potential moderators. We performed a literature search of peer-reviewed, English language, longitudinal observational studies that reported outcomes for children aged 0 to 18 years. In total, 56 studies, yielding 220 dependent effect sizes, met inclusion criteria. A multilevel random-effects model revealed a statistically significant, small detriment to child well-being for parental substance abuse over time (r = .15). Moderator analyses demonstrated that the effect was more pronounced for parental drug use (r = .25), compared with alcohol use (r = .13), tobacco use (r = .13), and alcohol use disorder (r = .14). Results highlight a need for future studies that better capture the effect of parental psychoactive substance abuse on the full breadth of childhood well-being outcomes and to integrate substance abuse into models that specify the precise conditions under which parental behavior determines child well-being.
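For illustration, the sketch below pools correlations with a DerSimonian-Laird random-effects model via Fisher's z; note that the study's multilevel model additionally handles dependent effect sizes, which this simplified version ignores, and the inputs echo (but are not) the reported values.

```python
import numpy as np

def pool_correlations(rs, ns):
    z = np.arctanh(np.asarray(rs, dtype=float))  # Fisher z transform
    v = 1.0 / (np.asarray(ns) - 3)               # within-study variances
    w = 1.0 / v
    z_fe = np.sum(w * z) / np.sum(w)             # fixed-effect mean
    q = np.sum(w * (z - z_fe) ** 2)              # heterogeneity Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(z) - 1)) / c)      # DL between-study variance
    w_re = 1.0 / (v + tau2)
    return np.tanh(np.sum(w_re * z) / np.sum(w_re))  # pooled r

print(pool_correlations(rs=[0.25, 0.13, 0.13, 0.14], ns=[400, 650, 300, 500]))
```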
Determining infectious cross-transmission events in healthcare settings involves manual surveillance of case clusters by infection control personnel, followed by strain typing of clinical/environmental isolates suspected in said clusters. Recent advances in genomic sequencing and cloud computing now allow for the rapid molecular typing of infecting isolates.
Objective:
To facilitate rapid recognition of transmission clusters, we aimed to assess infection control surveillance using whole-genome sequencing (WGS) of microbial pathogens to identify cross-transmission events for epidemiologic review.
Methods:
Clinical isolates of Staphylococcus aureus, Enterococcus faecium, Pseudomonas aeruginosa, and Klebsiella pneumoniae were obtained prospectively at an academic medical center, from September 1, 2016, to September 30, 2017. Isolate genomes were sequenced, followed by single-nucleotide variant analysis; a cloud-computing platform was used for whole-genome sequence analysis and cluster identification.
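A generic sketch of the cluster-identification step follows: pairwise SNV distances between isolates, single-linkage clustering, and a cut at an assumed SNV threshold. The study's cloud platform is proprietary, so this shows only the generic logic.

```python
# Cluster isolates whose pairwise SNV distance falls under a threshold.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def snv_clusters(dist_matrix, threshold=20):
    """dist_matrix: symmetric pairwise SNV counts; threshold is an
    assumed, species-dependent cutoff."""
    condensed = squareform(dist_matrix, checks=False)
    tree = linkage(condensed, method="single")
    return fcluster(tree, t=threshold, criterion="distance")

# Three closely related isolates plus one distant outlier:
d = np.array([[  0,   5,   8, 200],
              [  5,   0,   6, 210],
              [  8,   6,   0, 190],
              [200, 210, 190,   0]])
print(snv_clusters(d))   # isolates 1-3 form a cluster; isolate 4 does not
```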
Results:
Most strains of the 4 studied pathogens were unrelated; however, 34 potential transmission clusters were identified. The characteristics of the potential clusters were complex and likely not identifiable by traditional surveillance alone. Notably, only 1 cluster had been suspected by routine manual surveillance.
Conclusions:
Our work supports the assertion that integration of genomic and clinical epidemiologic data can augment infection control surveillance, both for the identification of cross-transmission events and for the inclusion of missed and exclusion of misidentified outbreaks (ie, false alarms). The integration of clinical data is essential to prioritize suspect clusters for investigation, and for existing infections, a timely review of both the clinical and WGS results holds promise to reduce healthcare-associated infections (HAIs). A richer understanding of cross-transmission events within healthcare settings will require the expansion of current surveillance approaches.
Piglet mortality has a negative impact on animal welfare and public acceptance, and the number of weaned piglets per sow largely determines the profitability of piglet production. Increased litter sizes are associated with lower birth weights and reduced piglet survival. Decreased survival rates and performance of piglets make the control of diseases and infections within pig production even more crucial. Consequently, selection for immunocompetence is becoming a key aspect of modern breeding programmes. However, the phenotypic recording of immune traits is difficult and expensive to implement within farm routines. Even though immune traits show genetic variability, only few examples exist of their suitability within a breeding programme and of their relationships to economically important production traits. The analysis of immune traits for an evaluation of immunocompetence, with the aim of a generally improved immune response, is promising. In-depth knowledge of the genetic background of the immune system is needed to gain helpful insights into its possible incorporation into breeding programmes. Possible physiological drawbacks of enhanced immunocompetence must be considered with regard to the allocation theory and possible trade-offs between the immune system and performance. This review aims to discuss the relationships between the immunocompetence of the pig and piglet survival, as well as the potential of these traits to be included in a breeding strategy for improved robustness.