Within the US military, monitoring body composition is an essential component of predicting physical performance and establishing soldier readiness. The purpose of this study was to explore mobile phone three-dimensional optical imaging (3DO), a user-friendly technology capable of rapidly obtaining reliable anthropometric measurements, and to determine the validity of the new Army one-site body fat equations using 3DO-derived abdominal circumference. Ninety-six participants (51 F, 45 M; age: 23·7 ± 6·5 years; BMI: 24·7 ± 4·1 kg/m2) were assessed using 3DO, dual-energy X-ray absorptiometry (DXA) and a 4-compartment model (4C). The validity of the Army equations using 3DO abdominal circumference was compared with 4C and DXA estimates. Compared with the 4C model, the Army equation overestimated BF% and fat mass (FM) by 1·3 ± 4·8 % and 0·9 ± 3·4 kg, respectively, while fat-free mass (FFM) was underestimated by 0·9 ± 3·4 kg (P < 0·01 for each). Values from DXA and the Army equation were similar for BF%, FM and FFM (constant errors between −0·1 and 0·1 units; P ≥ 0·82 for each). In both comparisons, notable proportional bias was observed with slope coefficients of −0·08 to −0·43. Additionally, limits of agreement were 9·5–10·2 % for BF% and 6·8–7·8 kg for FM and FFM. Overall, while group-level performance of the one-site Army equation was acceptable, it exhibited notable proportional bias when compared with laboratory criterion methods and wide limits of agreement, indicating potential concerns when applied to individuals. 3DO may provide opportunities for the development of more advanced, automated digital anthropometric body fat estimation in military settings.
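The agreement statistics reported here (constant error, proportional bias slope and 95 % limits of agreement) follow the standard Bland–Altman approach. A minimal sketch of how these quantities are typically computed from paired estimates is below; the data and function names are illustrative, not the study's actual code.

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bland-Altman agreement statistics for two paired measurement methods.

    Returns the constant error (mean difference), the 95 % limits of
    agreement, and the slope of differences on means (proportional bias).
    """
    a = np.asarray(method_a, dtype=float)
    b = np.asarray(method_b, dtype=float)
    diffs = a - b                      # e.g. Army equation minus 4C criterion
    means = (a + b) / 2.0
    constant_error = diffs.mean()
    loa_half_width = 1.96 * diffs.std(ddof=1)
    # Proportional bias: slope of a least-squares fit of differences on means
    slope, _intercept = np.polyfit(means, diffs, 1)
    return constant_error, (constant_error - loa_half_width,
                            constant_error + loa_half_width), slope
```

A non-zero slope indicates that the error grows (or shrinks) with the magnitude of the measurement, which is what the −0·08 to −0·43 slope coefficients above describe.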
Despite proven effectiveness in refractory schizophrenia, clozapine remains underutilised, and it is important to understand potential reasons for this. This study’s aim was to examine, in a national sample of consultant psychiatrists, their knowledge of, attitudes towards and perceived barriers to clozapine use.
Methods:
A novel questionnaire was designed and distributed by email to 275 consultant psychiatrists in the Republic of Ireland.
Results:
Twenty-eight percent (n = 77) completed the survey, with 55% of respondents practising for 15 or more years. Clinicians expressed confidence in managing clozapine treatment and side effects and were well aware of clozapine’s clinical effectiveness and guideline-based use. A majority indicated insufficient experience in managing rechallenge, and half reported insufficient experience in managing adverse events. Perceived patient factors were highlighted as barriers, with 69% of respondents reporting patients’ concern about effectiveness and 50% regarding tolerability. Sixty-four percent (n = 40) indicated that a specialised/tertiary clozapine service would facilitate initiation, with 57% (n = 36) reporting that less frequent blood monitoring would aid clozapine prescribing. A majority identified that access to dedicated staff (81%, n = 51) and dedicated day hospital services (84%, n = 53) would facilitate community initiation.
Conclusion:
Consultants are familiar with clozapine use and related guidelines. Provision of dedicated staff and facilities for clozapine use is one identified structural change that could enhance clozapine prescribing in Ireland. A tertiary service or clinical advice service would assist in clozapine rechallenge cases and in managing significant adverse events. More structured patient education regarding clozapine effectiveness, and professional development programmes focused on managing side effects and rechallenge, may promote clozapine use.
Levofloxacin prophylaxis reduces bloodstream infections in neutropenic patients with acute myeloid leukemia or relapsed acute lymphoblastic leukemia. A retrospective, longitudinal cohort study compared the incidence of bacteremia, multidrug-resistant organisms (MDRO), and Clostridioides difficile infection (CDI) across time periods before and after levofloxacin prophylaxis implementation. Benefits were sustained without increasing MDRO or CDI.
Rhodiola rosea (RR) is a plant whose bioactive components may function as adaptogens, thereby increasing resistance to stress and improving overall resilience. Some of these effects may influence exercise performance and adaptations. Based on studies of rodents, potential mechanisms for the ergogenic effects of RR include modulation of energy substrate stores and use, reductions in fatigue and muscle damage and altered antioxidant activity. At least sixteen investigations in humans have explored the potential ergogenicity of RR. These studies indicate acute RR supplementation (∼200 mg RR containing ∼1 % salidroside and ∼3 % rosavin, provided 60 min before exercise) may prolong time-to-exhaustion and improve time trial performance in recreationally active males and females, with limited documented benefits of chronic supplementation. Recent trials providing higher doses (∼1500 to 2400 mg RR/d for 4–30 d) have demonstrated ergogenic effects during sprints on bicycle ergometers and resistance training in trained and untrained adults. The effects of RR on muscle damage, inflammation, energy system modulation, antioxidant activity and perceived exertion are presently equivocal. Collectively, it appears that adequately dosed RR enhances dimensions of exercise performance and related outcomes for select tasks. However, the current literature does not unanimously show that RR is ergogenic. Variability in supplementation dose and duration, concentration of bioactive compounds, participant characteristics, exercise tests and statistical considerations may help explain these disparate findings. Future research should build on the longstanding use of RR and contemporary clinical trials to establish the conditions in which supplementation facilitates exercise performance and adaptations.
Background: ALS is a progressive neurodegenerative disease without a cure and with limited treatment options. Edaravone, a free radical scavenger, was shown to slow disease progression in a select group of patients with ALS over 6 months; however, the effect on survival was not investigated in randomized trials. The objective of this study is to describe real-world survival effectiveness over a longer timeframe. Methods: This retrospective cohort study included patients with ALS across Canada with symptom onset within the prior three years. Those with a minimum 6-month edaravone exposure between 2017 and 2022 were enrolled in the interventional arm, and those without formed the control arm. The primary outcome of tracheostomy-free survival was compared between the two groups, accounting for age, sex, ALS-disease progression rate, disease duration, pulmonary vital capacity, bulbar ALS-onset, and presence of frontotemporal dementia or C9ORF72 mutation using inverse propensity treatment weights. Results: 182 patients with mean ± SD age 60±11 years were enrolled in the edaravone arm and 860 in the control arm (mean ± SD age 63±12 years). Mean ± SD time from onset to edaravone initiation was 18±10 months. Tracheostomy-free survival will be calculated. Conclusions: This study will provide evidence for edaravone effectiveness on tracheostomy-free survival in patients with ALS.
The purpose of this investigation was to expand upon the limited existing research examining the test–retest reliability, cross-sectional validity and longitudinal validity of a sample of bioelectrical impedance analysis (BIA) devices as compared with a laboratory four-compartment (4C) model. Seventy-three healthy participants aged 19–50 years were assessed by each of fifteen BIA devices, with resulting body fat percentage estimates compared with a 4C model utilising air displacement plethysmography, dual-energy X-ray absorptiometry and bioimpedance spectroscopy. A subset of thirty-seven participants returned for a second visit 12–16 weeks later and were included in an analysis of longitudinal validity. The sample of devices included fourteen consumer-grade and one research-grade model in a variety of configurations: hand-to-hand, foot-to-foot and bilateral hand-to-foot (octapolar). BIA devices demonstrated high reliability, with precision error ranging from 0·0 to 0·49 %. Cross-sectional validity varied, with constant error relative to the 4C model ranging from −3·5 (sd 4·1) % to 11·7 (sd 4·7) %, standard error of the estimate values of 3·1–7·5 % and Lin’s concordance correlation coefficients (CCC) of 0·48–0·94. For longitudinal validity, constant error ranged from −0·4 (sd 2·1) % to 1·3 (sd 2·7) %, with standard error of the estimate values of 1·7–2·6 % and Lin’s CCC of 0·37–0·78. While performance varied widely across the sample investigated, select models of BIA devices (particularly octapolar and select foot-to-foot devices) may hold potential utility for the tracking of body composition over time, particularly in contexts in which the purchase or use of a research-grade device is infeasible.
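Lin's concordance correlation coefficient, used above to grade cross-sectional and longitudinal validity, can be computed directly from paired estimates. A minimal sketch follows; it is not tied to the study's data.

```python
import numpy as np

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient between paired estimates.

    Unlike Pearson's r, the CCC penalises both location (mean) and scale
    (variance) differences between methods, so it reflects absolute
    agreement rather than just linear association.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()          # population variances, per Lin (1989)
    cov = ((x - mx) * (y - my)).mean()
    return 2 * cov / (vx + vy + (mx - my) ** 2)
```

A device with a systematic offset from the 4C model can therefore show a high Pearson correlation yet a much lower CCC, which is why CCC ranges such as 0·48–0·94 discriminate between devices here.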
Few investigations have evaluated the validity of current body composition technology among racially and ethnically diverse populations. This study assessed the validity of common body composition methods in a multi-ethnic sample stratified by race and ethnicity. One hundred and ten individuals (55 % female, age: 26·5 (sd 6·9) years) identifying as Asian, African American/Black, Caucasian/White, Hispanic, Multi-racial and Native American were enrolled. Seven body composition models (dual-energy X-ray absorptiometry (DXA), air displacement plethysmography (ADP), two bioelectrical impedance devices (BIS, IB) and three multi-compartment models) were evaluated against a four-compartment criterion model by assessing total error (TE) and standard error of the estimate. For the total sample, measures of % fat and fat-free mass (FFM) from multi-compartment models were all excellent to ideal (% fat: TE = 0·94–2·37 %; FFM: TE = 0·72–1·78 kg) compared with the criterion. % fat measures were very good to excellent for DXA, ADP and IB (TE = 2·52–2·89 %) and fairly good for BIS (TE = 4·12 %). For FFM, single device estimates were good (BIS; TE = 3·12 kg) to ideal (DXA, ADP, IB; TE = 1·21–2·15 kg). Results did not vary meaningfully between each race and ethnicity, except BIS was not valid for African American/Black, Caucasian/White and Multi-racial participants for % fat (TE = 4·3–4·9 %). The multi-compartment models evaluated can be utilised in a multi-ethnic sample and in each individual race and ethnicity to obtain highly valid results for % fat and FFM. Estimates from DXA, ADP and IB were also valid. The BIS may demonstrate greater TE for all racial and ethnic cohorts and results should be interpreted cautiously.
Poor mental health is a state of psychological distress that is influenced by lifestyle factors such as sleep, diet, and physical activity. Compulsivity is a transdiagnostic phenotype that cuts across a range of mental illnesses, including obsessive–compulsive disorder and substance-related and addictive disorders, and is also influenced by lifestyle. Yet how lifestyle relates to compulsivity is presently unknown, although understanding this relationship is important for gaining insight into individual differences in mental health. We assessed (a) the relationships between compulsivity and diet quality, sleep quality, and physical activity, and (b) whether psychological distress statistically contributes to these relationships.
Methods
We collected harmonized data on compulsivity, psychological distress, and lifestyle from two independent samples (Australian n = 880 and US n = 829). We used mediation analyses to investigate bidirectional relationships between compulsivity and lifestyle factors, and the role of psychological distress.
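The product-of-coefficients approach is one common way to estimate the indirect effects tested in such mediation analyses. Below is a simplified single-mediator sketch using ordinary least squares; the study's actual models may differ (e.g. in covariates and bootstrapped inference), and all variable names are illustrative.

```python
import numpy as np

def indirect_effect(x, m, y):
    """Product-of-coefficients mediation estimate for one mediator.

    a: effect of predictor x on mediator m        (m ~ x)
    b: effect of m on outcome y, controlling x    (y ~ x + m)
    The indirect (mediated) effect is a * b.
    """
    x = np.asarray(x, dtype=float)
    m = np.asarray(m, dtype=float)
    y = np.asarray(y, dtype=float)
    ones = np.ones_like(x)
    # a path: regress the mediator on the predictor
    a = np.linalg.lstsq(np.column_stack([ones, x]), m, rcond=None)[0][1]
    # b path: regress the outcome on predictor and mediator; take m's coefficient
    b = np.linalg.lstsq(np.column_stack([ones, x, m]), y, rcond=None)[0][2]
    return a * b
```

In the study's framing, x would be a lifestyle factor such as sleep quality, m psychological distress, and y compulsivity (with bidirectional models also fitted).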
Results
Higher compulsivity was significantly related to poorer diet and sleep. Psychological distress statistically mediated the relationship between poorer sleep quality and higher compulsivity, and partially statistically mediated the relationship between poorer diet and higher compulsivity.
Conclusions
Lifestyle interventions in compulsivity may target psychological distress in the first instance, followed by sleep and diet quality. As psychological distress links aspects of lifestyle and compulsivity, focusing on mitigating and managing distress may offer a useful therapeutic approach to improve physical and mental health. Future research may focus on the specific sleep and diet patterns which may alter compulsivity over time to inform lifestyle targets for prevention and treatment of functionally impairing compulsive behaviors.
The present study reports the validity of multiple assessment methods for tracking changes in body composition over time and quantifies the influence of unstandardised pre-assessment procedures. Resistance-trained males underwent 6 weeks of structured resistance training alongside a hyperenergetic diet, with four total body composition evaluations. Pre-intervention, body composition was estimated in standardised (i.e. overnight fasted and rested) and unstandardised (i.e. no control over pre-assessment activities) conditions within a single day. The same assessments were repeated post-intervention, and body composition changes were estimated from all possible combinations of pre-intervention and post-intervention data. Assessment methods included dual-energy X-ray absorptiometry (DXA), air displacement plethysmography, three-dimensional optical imaging, single- and multi-frequency bioelectrical impedance analysis, bioimpedance spectroscopy and multi-component models. Data were analysed using equivalence testing, Bland–Altman analysis, Friedman tests and validity metrics. Most methods demonstrated meaningful errors when unstandardised conditions were present pre- and/or post-intervention, resulting in blunted or exaggerated changes relative to true body composition changes. However, some methods – particularly DXA and select digital anthropometry techniques – were more robust to a lack of standardisation. In standardised conditions, methods exhibiting the highest overall agreement with the four-component model were other multi-component models, select bioimpedance technologies, DXA and select digital anthropometry techniques. Although specific methods varied, the present study broadly demonstrates the importance of controlling and documenting standardisation procedures prior to body composition assessments across distinct assessment technologies, particularly for longitudinal investigations. 
Additionally, there are meaningful differences in the ability of common methods to track longitudinal body composition changes.
Using data from a nationally generalisable birth cohort, we aimed to: (i) describe the cohort’s adherence to national evidence-based dietary guidelines using an Infant Feeding Index (IFI) and (ii) assess the IFI’s convergent construct validity, by exploring associations with antenatal maternal socio-demographic and health behaviours and with child overweight/obesity and central adiposity at age 54 months. Data were from the Growing Up in New Zealand cohort (n 6343). The IFI scores ranged from zero to twelve points, with twelve representing full adherence to the guidelines. Overweight/obesity was defined by BMI-for-age (based on the WHO Growth Standards). Central adiposity was defined as waist-to-height ratio > 90th percentile. Associations were tested using multiple linear regression and Poisson regression with robust variance (risk ratios, 95 % CI). Mean IFI score was 8·2 (sd 2·1). Maternal characteristics explained 29·1 % of variation in the IFI score. Maternal age, education and smoking had the strongest independent relationships with IFI scores. Compared with children in the highest IFI tertile, girls in the lowest and middle tertiles were more likely to be overweight/obese (1·46, 1·03, 2·06 and 1·56, 1·09, 2·23, respectively) and boys in the lowest tertile were more likely to have central adiposity (1·53, 1·02, 2·30) at age 54 months. Most infants fell short of meeting national Infant Feeding Guidelines. The associations between IFI score and maternal characteristics, and children’s overweight/obesity/central adiposity, were in the expected directions and confirm the IFI’s convergent construct validity.
The vacuum-exhausted isolation locker (VEIL) provides a safety barrier during the care of COVID-19 patients. The VEIL is a 175-L enclosure with exhaust ports to continuously extract air through viral particle filters connected to hospital suction. Our experiments show that the VEIL contains and exhausts exhaled aerosols and droplets.
Background: With the emergence of antibiotic-resistant threats and the need for appropriate antibiotic use, laboratory microbiology information is important to guide clinical decision making in nursing homes, where access to such data can be limited. Susceptibility data are necessary to inform antibiotic selection and to monitor changes in resistance patterns over time. To contribute to existing data that describe antibiotic resistance among nursing home residents, we summarized antibiotic susceptibility data from organisms commonly isolated from urine cultures collected as part of the CDC multistate Emerging Infections Program (EIP) nursing home prevalence survey. Methods: In 2017, urine culture and antibiotic susceptibility data for selected organisms were retrospectively collected from nursing home residents’ medical records by trained EIP staff. Urine culture results reported as negative (no growth) or contaminated were excluded. Susceptibility results were recorded as susceptible, non-susceptible (resistant or intermediate), or not tested. The pooled mean percentage tested and percentage non-susceptible were calculated for selected antibiotic agents and classes using available data. Susceptibility data were analyzed for organisms with ≥20 isolates. The definition for multidrug resistance (MDR) was based on the CDC and European Centre for Disease Prevention and Control’s interim standard definitions. Data were analyzed using SAS v 9.4 software. Results: Among 161 participating nursing homes and 15,276 residents, 300 residents (2.0%) had documentation of a urine culture at the time of the survey, and 229 (76.3%) were positive. Escherichia coli, Proteus mirabilis, Klebsiella spp, and Enterococcus spp represented 73.0% of all urine isolates (N = 278). There were 215 (77.3%) isolates with reported susceptibility data (Fig. 1). Of these, data were analyzed for 187 (87.0%) (Fig. 2). All isolates tested for carbapenems were susceptible.
Fluoroquinolone non-susceptibility was most prevalent among E. coli (42.9%) and P. mirabilis (55.9%). Among Klebsiella spp, the highest percentages of non-susceptibility were observed for extended-spectrum cephalosporins and folate pathway inhibitors (25.0% each). Glycopeptide non-susceptibility was 10.0% for Enterococcus spp. The percentage of isolates classified as MDR ranged from 10.1% for E. coli to 14.7% for P. mirabilis. Conclusions: Substantial levels of non-susceptibility were observed for nursing home residents’ urine isolates, with 10% to 56% reported as non-susceptible to the antibiotics assessed. Non-susceptibility was highest for fluoroquinolones, an antibiotic class commonly used in nursing homes, and ≥ 10% of selected isolates were MDR. Our findings reinforce the importance of nursing homes using susceptibility data from laboratory service providers to guide antibiotic prescribing and to monitor levels of resistance.
Background: Antibiotics are among the most commonly prescribed drugs in nursing homes; urinary tract infections (UTIs) are a frequent indication. Although there is no gold standard for the diagnosis of UTIs, various criteria have been developed to inform and standardize nursing home prescribing decisions, with the goal of reducing unnecessary antibiotic prescribing. Using different published criteria designed to guide decisions on initiating treatment of UTIs (ie, symptomatic, catheter-associated, and uncomplicated cystitis), our objective was to assess the appropriateness of antibiotic prescribing among nursing home residents. Methods: In 2017, the CDC Emerging Infections Program (EIP) performed a prevalence survey of healthcare-associated infections and antibiotic use in 161 nursing homes from 10 states: California, Colorado, Connecticut, Georgia, Maryland, Minnesota, New Mexico, New York, Oregon, and Tennessee. EIP staff reviewed resident medical records to collect demographic and clinical information, infection signs, symptoms, and diagnostic testing documented on the day an antibiotic was initiated and 6 days prior. We applied 4 criteria to determine whether initiation of treatment for UTI was supported: (1) the Loeb minimum clinical criteria (Loeb); (2) the Suspected UTI Situation, Background, Assessment, and Recommendation tool (UTI SBAR tool); (3) adaptation of Infectious Diseases Society of America UTI treatment guidelines for nursing home residents (Crnich & Drinka); and (4) diagnostic criteria for uncomplicated cystitis (cystitis consensus) (Fig. 1). We calculated the percentage of residents for whom initiating UTI treatment was appropriate by these criteria. Results: Of 248 residents for whom UTI treatment was initiated in the nursing home, the median age was 79 years [IQR, 19], 63% were female, and 35% were admitted for postacute care.
There was substantial variability in the percentage of residents with antibiotic initiation classified as appropriate by each of the criteria, ranging from 8% for the cystitis consensus, to 27% for Loeb, to 33% for the UTI SBAR tool, to 51% for Crnich and Drinka (Fig. 2). Conclusions: Appropriate initiation of UTI treatment among nursing home residents remained low regardless of criteria used. At best only half of antibiotic treatment met published prescribing criteria. Although insufficient documentation of infection signs, symptoms and testing may have contributed to the low percentages observed, adequate documentation in the medical record to support prescribing should be standard practice, as outlined in the CDC Core Elements of Antibiotic Stewardship for nursing homes. Standardized UTI prescribing criteria should be incorporated into nursing home stewardship activities to improve the assessment and documentation of symptomatic UTI and to reduce inappropriate antibiotic use.
The Eating Assessment in Toddlers FFQ (EAT FFQ) has been shown to have good reliability and comparative validity for ranking nutrient intakes in young children. With the addition of food items (n 4), we aimed to re-assess the validity of the EAT FFQ and estimate calibration factors in a sub-sample of children (n 97) participating in the Growing Up Milk – Lite (GUMLi) randomised control trial (2015–2017). Participants completed the ninety-nine-item GUMLi EAT FFQ and record-assisted 24-h recalls (24HR) on two occasions. Energy and nutrient intakes were assessed at months 9 and 12 post-randomisation and calibration factors calculated to determine predicted estimates from the GUMLi EAT FFQ. Validity was assessed using Pearson correlation coefficients, weighted kappa (κ) and exact quartile categorisation. Calibration was calculated using linear regression models on 24HR, adjusted for sex and treatment group. Nutrient intakes were significantly correlated between the GUMLi EAT FFQ and 24HR at both time points. Energy-adjusted, de-attenuated Pearson correlations ranged from 0·3 (fibre) to 0·8 (Fe) at 9 months and from 0·3 (Ca) to 0·7 (Fe) at 12 months. Weighted κ for the quartiles ranged from 0·2 (Zn) to 0·6 (Fe) at 9 months and from 0·1 (total fat) to 0·5 (Fe) at 12 months. Exact agreement ranged from 30 to 74 %. Calibration factors predicted up to 56 % of the variation in the 24HR at 9 months and 44 % at 12 months. The GUMLi EAT FFQ remained a useful tool for ranking nutrient intakes with similar estimated validity compared with other FFQ used in children under 2 years.
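The calibration step described above (linear regression of 24HR intakes on FFQ estimates, adjusted for sex and treatment group) can be sketched as follows. Variable names are illustrative and the model is a simplification of the study's analysis.

```python
import numpy as np

def calibrate(ffq, recall, sex, group):
    """Fit recall ~ ffq + sex + group by ordinary least squares.

    The coefficient on `ffq` is the calibration factor; predictions from
    the fitted model are the calibrated ('predicted') nutrient intakes.
    Returns the coefficient vector and the model R^2 (variance explained
    in the 24HR, analogous to the 44-56 % figures reported).
    """
    ffq = np.asarray(ffq, dtype=float)
    recall = np.asarray(recall, dtype=float)
    X = np.column_stack([np.ones_like(ffq), ffq,
                         np.asarray(sex, float), np.asarray(group, float)])
    beta, *_ = np.linalg.lstsq(X, recall, rcond=None)
    pred = X @ beta
    ss_res = ((recall - pred) ** 2).sum()
    ss_tot = ((recall - recall.mean()) ** 2).sum()
    return beta, 1.0 - ss_res / ss_tot
```

The same fit would be repeated per nutrient and per time point (9 and 12 months) to obtain nutrient-specific calibration factors.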
Introduction: Emergency department (ED) crowding is a major problem across Canada. We studied the ability of artificial intelligence methods to improve patient flow through the ED by predicting patient disposition using information available at triage and shortly after patients’ arrival in the ED. Methods: This retrospective study included all visits to an urban, academic, adult ED between May 2012 and June 2019. For each visit, 489 variables were extracted including triage data that had been collected for use in the Canadian Triage and Acuity Scale (CTAS) and information regarding laboratory tests, radiological tests, consultations and admissions. A training set consisting of all visits from April 2012 up to December 2018 was used to train 5 classes of machine learning models to predict admission to the hospital from the ED. The models were trained to predict admission at the time of the patient's arrival in the ED and every 30 minutes after arrival until 6 hours into their ED stay. The performance of models was compared using the area under the ROC curve (AUC) on a test set consisting of all visits from January 2019 to June 2019. Results: The study included 536,332 visits and the admission rate was 15.0%. Gradient boosting models generally outperformed other machine learning models. A gradient boosting model using all available data at 2 hours after patient arrival in the ED yielded a test set AUC 0.92 [95% CI 0.91-0.93], while a model using only data available at triage yielded an AUC 0.90 [95% CI 0.89-0.91]. The quality of predictions generally improved as predictions were made later in the patient's ED stay, leading to an AUC 0.95 [95% CI 0.93-0.96] at 6 hours after arrival. A gradient boosting model with 20 variables available at 2 hours after patient arrival in the ED yielded an AUC 0.91 [95% CI 0.89-0.93].
A gradient boosting model that makes predictions at 2 hours after arrival in the ED using only variables that are available at all EDs in the province of Quebec yielded an AUC 0.91 [95% CI 0.89-0.92]. Conclusion: Machine learning can predict admission to a hospital from the ED using variables that are collected as part of routine ED care. Machine learning tools may potentially be used to help ED physicians make faster and more appropriate disposition decisions, to decrease unnecessary testing and to alleviate ED crowding.
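The AUC values used to compare models here can be computed without any machine learning library via the rank-sum (Mann–Whitney) identity: AUC is the probability that a randomly chosen admitted visit receives a higher predicted score than a randomly chosen discharged visit. A small sketch, assuming continuous (untied) scores:

```python
import numpy as np

def auc_score(y_true, y_score):
    """Area under the ROC curve via the rank-sum identity.

    y_true:  0/1 labels (1 = admitted)
    y_score: predicted admission probabilities or scores
    Note: ties in y_score are not handled in this sketch.
    """
    y_true = np.asarray(y_true)
    y_score = np.asarray(y_score, dtype=float)
    order = np.argsort(y_score)
    ranks = np.empty(len(y_score), dtype=float)
    ranks[order] = np.arange(1, len(y_score) + 1)   # rank 1 = lowest score
    n_pos = y_true.sum()
    n_neg = len(y_true) - n_pos
    # Sum of positive-class ranks, minus its minimum possible value,
    # normalised by the number of positive/negative pairs
    return (ranks[y_true == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
```

An AUC of 0.92, as reported for the 2-hour model, means the model ranks an admitted visit above a discharged one 92% of the time.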
Acute change in mental status (ACMS), defined by the Confusion Assessment Method, is used to identify infections in nursing home residents. A medical record review revealed that none of 15,276 residents had an ACMS documented. Using the revised McGeer criteria with a possible ACMS definition, we identified 296 residents and 21 additional infections. The use of a possible ACMS definition should be considered for retrospective nursing home infection surveillance.
There are no estimates of the heritability of phenotypic udder traits in suckler sheep, which produce meat lambs, or of whether these traits are associated with resilience to mastitis. Mastitis is a common disease which damages the mammary gland and reduces productivity. The aims of this study were to investigate the feasibility of collecting udder phenotypes, their heritability and their association with mastitis in suckler ewes. Udder and teat conformation, teat lesions, intramammary masses (IMM) and litter size were recorded from 10 Texel flocks in Great Britain between 2012 and 2014; 968 records were collected. Pedigree data were obtained from an online pedigree recording system. Univariate quantitative genetic parameters were estimated using animal and sire models. Linear mixed models were used to analyse continuous traits and generalised linear mixed models were used to analyse binary traits. Continuous traits had higher heritabilities than binary traits, with teat placement and teat length heritabilities (h2) highest at 0.35 (SD 0.04) and 0.42 (SD 0.04), respectively. Udder width, drop and separation heritabilities were lower and varied with udder volume. The heritabilities of IMM and teat lesions (sire model) were 0.18 (SD 0.12) and 0.17 (SD 0.11), respectively. All heritabilities were sufficiently high for inclusion in a selection programme to increase resilience to mastitis in the population of Texel sheep. Further studies are required to investigate genetic relationships between traits, to determine whether udder traits predict IMM, and to quantify the potential benefits of including these traits in a selection programme to increase resilience to chronic mastitis.