Background: Nipocalimab, a fully human, effectorless anti-neonatal Fc receptor (FcRn) monoclonal antibody, may ameliorate disease manifestations of generalized myasthenia gravis (gMG) by selectively blocking FcRn-mediated IgG recycling and lowering IgG, including pathogenic autoantibodies. The objective was to evaluate the efficacy and safety of intravenous nipocalimab added to background standard-of-care therapy in adolescents with gMG. Methods: Seropositive patients (aged 12 to <18 years) with gMG (MGFA Class II–IV) that was inadequately controlled on stable therapy were enrolled in a 24-week open-label study. Nipocalimab was administered as a 30 mg/kg IV loading dose followed by 15 mg/kg IV every 2 weeks. Results: Seven adolescents were enrolled; five completed 24 weeks of dosing. The mean (SD) age was 14.1 (1.86) years; seven were anti-AChR+ and six were female. Mean (SD) baseline MG-ADL/QMG scores were 4.29 (2.430)/12.50 (3.708). Nipocalimab produced a significant reduction in total serum IgG at week 24: the mean (SD) change from baseline was -68.98% (7.561). The mean (SD) change in MG-ADL/QMG scores at week 24 was -2.40 (0.418)/-3.80 (2.683); 4 of 5 patients achieved minimal symptom expression (MG-ADL score 0–1) by week 24. Nipocalimab was well tolerated; there were no serious adverse events and no clinically meaningful laboratory changes. Conclusions: Nipocalimab demonstrated efficacy and safety in this 6-month trial in seropositive adolescents with gMG.
Dietary nitrate is a precursor to nitric oxide, for which plausible mechanisms exist for both beneficial and detrimental influences in multiple sclerosis (MS)(1,2). Whether dietary nitrate has any role in MS onset is unclear. We aimed to test associations between nitrate intake from food sources (plant, vegetable, animal, processed meat, and unprocessed meat) and likelihood of a first clinical diagnosis of central nervous system demyelination (FCD). We used data from the Ausimmune Study (264 cases, 474 controls). Case participants (aged 19–59 years) presenting to medical professionals in four latitudinally different regions of Australia were referred to the study with an FCD. The Australian Electoral Roll was used to recruit one to four controls per case, matched by age (± 2 years), sex and study region. Habitual dietary intake representing the 12-month period preceding the study interview was assessed to determine dietary nitrate intake. In addition to matching variables, data on education, smoking history, history of infectious mononucleosis, weight and height were collected. A blood sample was taken for measurement of serum 25-hydroxyvitamin D concentration, which was de-seasonalised. To test associations, we used logistic regression with full propensity score matching. We used two levels of covariate matching: in model 1, cases and controls were matched on the original matching variables (age, sex, and study region); in model 2, cases and controls were additionally matched on well-established/potential risk factors for MS (education, smoking history, and history of infectious mononucleosis) and dietary factors (total energy intake and dietary misreporting). In females only (n = 573; 368 controls and 205 cases), higher nitrate intake (per 60 mg/day) from plant-based foods (fully adjusted odds ratio [aOR] = 0.50, 95% CI, 0.31, 0.81, p < 0.01) or vegetables (aOR = 0.44, 95% CI, 0.27, 0.73, p < 0.01) was statistically significantly associated with lower likelihood of FCD. No association was found between nitrate intake (from any source) and likelihood of FCD in males. To our knowledge, this is the first study to investigate dietary nitrate intake in relation to FCD. Our finding that higher intake of nitrate from plant-based foods (mainly vegetables) was associated with lower likelihood of FCD in females supports our previous work showing that following a Mediterranean diet (rich in vegetables) is associated with lower likelihood of FCD(3). The lack of association in males may be due to low statistical power and/or differing food preferences and pathological processes between males and females. Our results support further research to delineate the independent effect of nitrate from other dietary factors and to explore a possible beneficial role for plant-derived nitrate in people at high risk of MS.
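As an illustrative sketch (not part of the original abstract), the core association above can be expressed as a logistic regression of case status on plant-derived nitrate scaled per 60 mg/day. The study itself used full propensity-score matching; this simplified version substitutes plain covariate adjustment, and all column names are hypothetical.

```python
# Hedged sketch of the nitrate-FCD association model. The actual analysis used
# logistic regression with full propensity-score matching; here we simply adjust
# for covariates. Column names (case, plant_nitrate_mg, age, sex, region,
# energy_kcal) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ausimmune_analysis.csv")  # hypothetical analysis file

# Scale nitrate so the odds ratio is interpreted per 60 mg/day, as in the abstract.
df["plant_nitrate_60"] = df["plant_nitrate_mg"] / 60.0

model = smf.logit(
    "case ~ plant_nitrate_60 + age + C(sex) + C(region) + energy_kcal",
    data=df,
).fit()

or_per_60mg = np.exp(model.params["plant_nitrate_60"])
ci_low, ci_high = np.exp(model.conf_int().loc["plant_nitrate_60"])
print(f"aOR per 60 mg/day plant nitrate: {or_per_60mg:.2f} ({ci_low:.2f}, {ci_high:.2f})")
```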
Low vitamin D status (circulating 25-hydroxyvitamin D [25(OH)D] concentration < 50 nmol/L) affects nearly one in four Australian adults(1). The primary source of vitamin D is sun exposure; however, a safe level of sun exposure for optimal vitamin D production has not been established. As supplement use is uneven, increasing vitamin D in food is the logical option for improving vitamin D status at a population level. The dietary supply of vitamin D is low since few foods are naturally rich in vitamin D. While there is no Australia-specific estimated average requirement (EAR) for vitamin D, the Institute of Medicine recommends an EAR of 10 μg/day for all ages. Vitamin D intake is low in Australia, with mean usual intake ranging from 1.8–3.2 μg/day across sex/age groups(2), suggesting a need for data-driven nutrition policy to improve the dietary supply of vitamin D. Food fortification has proven effective in other countries. We aimed to model four potential vitamin D fortification scenarios to determine an optimal strategy for Australia. We used food consumption data for people aged ≥ 2 years (n = 12,153) from the 2011–2012 National Nutrition and Physical Activity Survey, and analytical food composition data for vitamin D3, 25(OH)D3, vitamin D2 and 25(OH)D2(3). Certain foods are permitted for mandatory or voluntary fortification in Australia. As industry uptake of the voluntary option is low, Scenario 1 simulated addition of the maximum permitted amount of vitamin D to all foods permitted under the Australia New Zealand Food Standards Code (dairy products/plant-based alternatives, edible oil spreads, formulated beverages and permitted ready-to-eat breakfast cereals (RTEBC)). Scenarios 2–4 modelled higher concentrations than those permitted for fluid milk/alternatives (1 μg/100 mL) and edible oil spreads (20 μg/100 g) within an expanding list of food vehicles: Scenario 2—dairy products/alternatives, edible oil spreads, formulated beverages; Scenario 3—Scenario 2 plus RTEBC; Scenario 4—Scenario 3 plus bread (which is not permitted for vitamin D fortification in Australia). Usual intake was modelled for the four scenarios across sex and age groups using the National Cancer Institute Method(4). Assuming equal bioactivity of the D vitamers, the range of mean usual vitamin D intake across age groups for males for Scenarios 1 to 4, respectively, was 7.2–8.8, 6.9–8.3, 8.0–9.7 and 9.3–11.3 μg/day; the respective values for females were 5.8–7.5, 5.8–7.2, 6.4–8.3 and 7.5–9.5 μg/day. No participant exceeded the upper level of intake (80 μg/day) under any scenario. Systematic fortification of all foods permitted for vitamin D fortification could substantially improve vitamin D intake across the population. However, the optimal strategy would require permissions for bread as a food vehicle, and addition of higher than permitted concentrations of vitamin D to fluid milks/alternatives and edible oil spreads.
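The fortification scenarios above amount to adding a fixed amount of vitamin D per 100 g or 100 mL of each permitted food vehicle and recomputing intake. The sketch below (an editorial illustration, not the study's code) shows that step for a single scenario; the actual analysis estimated usual intake distributions with the NCI Method, which is not reproduced here, and all file and column names are assumptions.

```python
# Hedged sketch of one fortification scenario: add a fixed amount of vitamin D
# per 100 g/mL of each permitted food vehicle to every eating occasion, then
# recompute each person's daily intake. The study modelled *usual* intake with
# the NCI Method; this sketch only recalculates observed daily intake.
import pandas as pd

# Fortification levels (micrograms of vitamin D per 100 g or 100 mL); values here
# are illustrative, loosely based on Scenario 2 in the abstract.
scenario = {"fluid_milk": 1.0, "edible_oil_spread": 20.0, "formulated_beverage": 1.25}

foods = pd.read_csv("nnpas_food_records.csv")  # hypothetical: one row per food per person per day
# columns: person_id, day, food_group, grams, vitd_ug (analysed vitamin D content)

foods["added_vitd_ug"] = foods.apply(
    lambda r: scenario.get(r["food_group"], 0.0) * r["grams"] / 100.0, axis=1
)
foods["total_vitd_ug"] = foods["vitd_ug"] + foods["added_vitd_ug"]

daily = foods.groupby(["person_id", "day"])["total_vitd_ug"].sum().reset_index()
print("Mean daily vitamin D intake under scenario:",
      round(daily["total_vitd_ug"].mean(), 1), "ug/day")
```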
Fatigue is a prevalent symptom in people with multiple sclerosis (MS), significantly impacting quality of life and daily functioning(1). The Mediterranean diet, with its anti-inflammatory and neuroprotective properties, may help to alleviate fatigue(2). However, existing evidence linking the Mediterranean diet to fatigue in people with MS is primarily cross-sectional, providing limited insights into long-term effects(3). This study aimed to prospectively test associations between the alternate Mediterranean diet score (aMED) and fatigue using data from the United Kingdom (UK) Multiple Sclerosis Register. Dietary intake was measured in 2016 (n = 2,455) and 2022 (n = 3,740) using the EPIC-Norfolk 130-item Food Frequency Questionnaire. A total of 879 participants provided dietary intake data at both timepoints. aMED is a score ranging from 0 to 9, with higher scores indicating greater adherence to a Mediterranean diet (higher consumption of vegetables, fruit, nuts, legumes, whole grains and fish; greater monounsaturated-to-saturated fat ratio; lower consumption of red meat; moderate consumption of alcohol). Fatigue was measured using the Fatigue Severity Scale (FSS), a 9-item questionnaire that evaluates the extent to which fatigue interferes with daily activities, with scores ranging from 1 to 7. Additionally, fatigue levels were categorised as either ‘high’ (FSS score ≥ 5) or ‘low’ (FSS score < 5). The association between aMED and fatigue over six years (from 2016 to 2022) was assessed using generalized mixed-effects models for the continuous FSS score, while mixed-effects logistic regression models were employed to examine the association between aMED and the binary FSS categories (high/low). The models were adjusted for age, sex, MS type (benign, relapsing-remitting, secondary progressive, primary progressive, unknown), and total energy intake (kcal/day). Analysis was restricted to participants who took part at both timepoints and had complete data on diet, covariates, and FSS (n = 379). The study population consisted of 71.5% females, with a mean age of 55.0 years (standard deviation, 9.9). Higher aMED scores (one-unit increase) were significantly associated with lower FSS scores (adjusted β = -0.08; 95% CI: -0.14, -0.03; p = 0.004) and with 17% lower odds of having high fatigue (FSS ≥ 5), with an adjusted odds ratio of 0.83 (95% CI: 0.71, 0.98; p = 0.029). These findings suggest that adherence to a Mediterranean diet may play a protective role in reducing fatigue severity in people with MS over a 6-year period. Further research could explore associations between a Mediterranean diet and other MS-related outcomes in this study population.
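For illustration only (not the Register's analysis code), the two longitudinal models described above can be sketched as follows: a linear mixed-effects model for the continuous FSS score with a random intercept per participant, and a logistic model for the high/low fatigue category. Variable names are hypothetical, and for simplicity the binary model below is an ordinary logistic regression rather than the mixed-effects logistic model used in the study.

```python
# Hedged sketch of the aMED-fatigue models; data are assumed to be in long
# format with one row per participant per wave (2016, 2022).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("uk_ms_register_diet_fatigue.csv")  # hypothetical file

# Continuous FSS: linear mixed-effects model, random intercept per participant.
lmm = smf.mixedlm(
    "fss ~ amed + age + C(sex) + C(ms_type) + energy_kcal",
    data=df, groups=df["participant_id"],
).fit()
print(lmm.summary())

# Binary high fatigue (FSS >= 5): the study used a mixed-effects logistic model;
# this sketch fits a plain logistic regression as a simplification.
logit = smf.logit(
    "high_fatigue ~ amed + age + C(sex) + C(ms_type) + energy_kcal", data=df
).fit()
print(logit.summary())
```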
From early on, infants show a preference for infant-directed speech (IDS) over adult-directed speech (ADS), and exposure to IDS has been correlated with language outcome measures such as vocabulary. The present multi-laboratory study explores this issue by investigating whether there is a link between early preference for IDS and later vocabulary size. Infants’ preference for IDS was tested as part of the ManyBabies 1 project, and follow-up CDI data were collected from a subsample of this dataset at 18 and 24 months. A total of 341 (18 months) and 327 (24 months) infants were tested across 21 laboratories. In neither preregistered analyses with North American and UK English, nor exploratory analyses with a larger sample did we find evidence for a relation between IDS preference and later vocabulary. We discuss implications of this finding in light of recent work suggesting that IDS preference measured in the laboratory has low test-retest reliability.
OBJECTIVES/GOALS: In this study, we aim to report the role of porins and blaCTX-M β-lactamases among Escherichia coli and Klebsiella pneumoniae, focusing on emerging carbapenem-resistant Enterobacterales (CRE) subtypes, including non-carbapenemase-producing Enterobacterales (NCPE) and ertapenem-resistant but meropenem-susceptible (ErMs) strains. METHODS/STUDY POPULATION: Whole genome sequencing was conducted on 76 carbapenem-resistant isolates across 5 hospitals in San Antonio, U.S. Among these, NCP isolates accounted for the majority of CRE (41/76). Identification and antimicrobial susceptibility testing (AST) results were collected from the clinical charts. Repeat speciation was determined through whole genome sequencing (WGS) analysis, and repeat AST was performed with microdilution or ETEST®. Minimum inhibitory concentrations (MICs) were interpreted according to Clinical and Laboratory Standards Institute criteria (CLSI M100, ED33). WGS and qPCR were used to characterize the resistome of all clinical CRE subtypes, while western blotting and liquid chromatography with tandem mass spectrometry (LC-MS/MS) were used to determine porin expression and carbapenem hydrolysis, respectively. RESULTS/ANTICIPATED RESULTS: blaCTX-M was found to be most prevalent among NCP isolates (p = 0.02). LC-MS/MS analysis revealed blaCTX-M-mediated carbapenem hydrolysis, indicating the need to reappraise the term “non-carbapenemase (NCP)” for quantitatively uncharacterized CRE strains harboring blaCTX-M. Susceptibility results showed that 56% of all NCPE isolates had an ErMs phenotype (NCPE vs. CPE, p < 0.001), with E. coli driving the phenotype (E. coli vs. K. pneumoniae, p < 0.001). ErMs strains carrying blaCTX-M had 4-fold more copies of blaCTX-M than ceftriaxone-resistant but ertapenem-susceptible isolates (3.7 vs. 0.9, p < 0.001). Immunoblot analysis demonstrated the absence of OmpC expression in NCP-ErMs E. coli, with 92% of strains lacking full contig coverage of ompC. DISCUSSION/SIGNIFICANCE: Overall, this work provides evidence of a collaborative effort between blaCTX-M and OmpC loss in NCP strains that confers resistance to ertapenem but not meropenem. Clinically, CRE subtypes are not readily appreciated, potentially leading to mismanagement of CRE-infected patients. A greater focus on optimal treatments for CRE subtypes is needed.
To describe breastfeeding rates from early to late infancy and to examine associations between breastfeeding duration and infant growth, including rapid weight gain (RWG, > 0·67 SD increase in weight-for-age Z-score), among infants from low-income, racially and ethnically diverse backgrounds.
Design:
A short, prospective cohort study was conducted assessing breastfeeding status at infant ages 2, 4, 6, 9 and 12 months. Infant length and weight measurements were retrieved from electronic health records to calculate weight-for-length Z-scores and the rate of weight gain.
Setting:
Pediatric clinic in the Southeastern USA.
Participants:
Mother-infant dyads (n = 256).
Results:
Most participants were African American (48 %) or Latina (34 %). Eighty-one per cent were participating in the Special Supplemental Nutrition Program for Women, Infants and Children. Infants were breastfed for a median duration of 4·75 months, with partial breastfeeding more common than exclusive breastfeeding. At 12 months, 28 % of the participants were breastfeeding. Infants breastfed beyond 6 months had significantly lower growth trajectories than infants breastfed for 0–2 months (β = 0·045, se = 0·013, P = 0·001) or 3–6 months (β = 0·054, se = 0·016, P = 0·001). Thirty-six per cent of the infants experienced RWG. RWG was more common among infants breastfed for 2 months or less than among those breastfed for 6 or more months (relative risk = 1·68, 95 % CI (1·03, 2·74), P = 0·03).
Conclusions:
Breastfeeding beyond 6 months is associated with the prevention of accelerated growth among infants from low-income, racially and ethnically diverse backgrounds, suggesting progress toward health equity.
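As an editorial illustration of the rapid weight gain (RWG) definition used in the abstract above (an increase of more than 0·67 SD in weight-for-age Z-score), the sketch below flags RWG from precomputed Z-scores. Calculating the Z-scores themselves requires a growth reference (e.g. WHO standards) and is not shown; the record layout is hypothetical.

```python
# Hedged sketch: flag rapid weight gain (RWG) as a change of more than 0.67 SD
# in weight-for-age Z-score (WAZ) between two ages. WAZ values are assumed to
# be precomputed against a growth reference.
import pandas as pd

records = pd.DataFrame({
    "infant_id": [1, 2, 3],
    "waz_birth": [-0.40, 0.10, 0.55],
    "waz_12mo":  [0.50, 0.30, 1.40],
})

records["delta_waz"] = records["waz_12mo"] - records["waz_birth"]
records["rwg"] = records["delta_waz"] > 0.67   # rapid weight gain flag

print(records[["infant_id", "delta_waz", "rwg"]])
print("RWG prevalence:", f"{records['rwg'].mean():.0%}")
```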
Background: Saccade and pupil responses are potential neurodegenerative disease biomarkers due to overlap between oculomotor circuitry and disease-affected areas. Instruction-based tasks have previously been examined as biomarker sources, but they are arduous for patients with limited cognitive abilities; additionally, few studies have evaluated multiple neurodegenerative pathologies concurrently. Methods: The Ontario Neurodegenerative Disease Research Initiative recruited individuals with Alzheimer’s disease (AD), mild cognitive impairment (MCI), amyotrophic lateral sclerosis (ALS), frontotemporal dementia, progressive supranuclear palsy, or Parkinson’s disease (PD). Patients (n=274, age 40-86) and healthy controls (n=101, age 55-86) viewed 10 minutes of frequently changing video clips without instruction while their eyes were tracked. We evaluated differences in saccade and pupil parameters (e.g. saccade frequency and amplitude, pupil size, responses to clip changes) between groups. Results: Preliminary data indicate low-level behavioural alterations in multiple disease cohorts: increased centre bias, lower overall saccade rate and reduced saccade amplitude. After clip changes, patient groups generally demonstrated lower saccade rates but higher microsaccade rates, to varying degrees. Additionally, pupil responses were blunted (AD, MCI, ALS) or exaggerated (PD). Conclusions: This task may generate behavioural biomarkers even in cognitively impaired populations. Future work should explore the possible effects of factors such as medication and disease stage.
To compare the prognostic value of mid-upper arm circumference (MUAC), weight-for-height Z-score (WHZ) and weight-for-age Z-score (WAZ) for predicting death over periods of 1, 3 and 6 months follow-up in children.
Design:
Pooled analysis of twelve prospective studies examining survival after anthropometric assessment. Sensitivity and false-positive ratios to predict death within 1, 3 and 6 months were compared for three individual anthropometric indices and their combinations.
Setting:
Community-based, prospective studies from twelve countries in Africa and Asia.
Participants:
Children aged 6–59 months living in the study areas.
Results:
For all anthropometric indices, the receiver operating characteristic curves were higher for shorter than for longer durations of follow-up. Sensitivity was higher for death with 1-month follow-up compared with 6 months by 49 % (95 % CI (30, 69)) for MUAC < 115 mm (P < 0·001), 48 % (95 % CI (9·4, 87)) for WHZ < -3 (P < 0·01) and 28 % (95 % CI (7·6, 42)) for WAZ < -3 (P < 0·005). This was accompanied by an increase in false positives of only 3 % or less. For all durations of follow-up, WAZ < -3 identified more children who died and were not identified by WHZ < -3 or by MUAC < 115 mm, 120 mm or 125 mm, but the use of WAZ < -3 led to an increased false-positive ratio up to 16·4 % (95 % CI (12·0, 20·9)) compared with 3·5 % (95 % CI (0·4, 6·5)) for MUAC < 115 mm alone.
Conclusions:
Frequent anthropometric measurements significantly improve the identification of malnourished children with a high risk of death without markedly increasing false positives. Combining two indices increases sensitivity but also increases false positives among children meeting case definitions.
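To make the headline metrics of the pooled analysis above concrete, the following editorial sketch computes sensitivity and the false-positive ratio for one anthropometric case definition (MUAC < 115 mm) against death within a 1-month follow-up window. The data layout and file name are assumptions.

```python
# Hedged sketch: sensitivity and false-positive ratio of an anthropometric case
# definition for predicting death within a given follow-up period.
import pandas as pd

df = pd.read_csv("pooled_cohorts.csv")  # hypothetical: muac_mm, whz, waz, died_1mo (0/1)

positive = df["muac_mm"] < 115          # meets case definition at baseline
died = df["died_1mo"] == 1              # died within 1 month of follow-up

sensitivity = (positive & died).sum() / died.sum()
false_positive_ratio = (positive & ~died).sum() / (~died).sum()

print(f"Sensitivity (1-month follow-up): {sensitivity:.1%}")
print(f"False-positive ratio:            {false_positive_ratio:.1%}")
```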
Social anxiety disorder (SAD) is a common mental health condition characterised by a persistent fear of social or performance situations. Despite effective treatments being available, many individuals with SAD do not seek treatment or delay seeking treatment for many years. The aim of the present study was to examine treatment barriers, treatment histories, and cognitive behavioural therapy (CBT) delivery preferences in a sample of women with clinically relevant SAD symptoms. Ninety-nine women (mean age = 34.90 years, SD = 11.28) completed the online questionnaires and were included in the study. Participants were recruited through advertisements on community noticeboards and posts on social media. The results demonstrated that less than 5% of those who had received psychological treatment in the past were likely to have received best-practice CBT. The most commonly cited barriers to accessing treatment for women with SAD related to direct costs (63%) and indirect costs (e.g., transport/childcare) (28%). The most preferred treatment delivery method overall was individual face-to-face treatment (70%). The study demonstrates a need to provide a variety of treatment options in order to enhance access to empirically supported treatment for women with SAD.
To improve dissemination and accessibility of guidelines to healthcare providers at our institution, guidance for infectious syndromes was incorporated into an electronic application (e-app). The objective of this study was to compare empiric antimicrobial prescribing before and after implementation of the e-app.
Design:
This study was a before-and-after trial.
Setting:
A tertiary-care, public hospital in Halifax, Canada.
Participants:
This study included pediatric patients admitted to hospital who were empirically prescribed an antibiotic for an infectious syndrome listed in the e-app.
Methods:
Data were collected from medical records. Prescribing was independently assessed considering patient-specific characteristics using a standardized checklist by 2 members of the research team. Assessments of antimicrobial prescribing were compared, and discrepancies were resolved through discussion. Empiric antimicrobial prescribing before and after implementation of the e-app was compared using interrupted time-series analysis.
Results:
In total, 237 patients were included in the preimplementation arm and 243 patients were included in the postimplementation arm. Pneumonia (23.8%), appendicitis (19.2%), and sepsis (15.2%) were the most common indications for antimicrobial use. Empiric antimicrobial use was considered optimal in 195 (81.9%) of 238 patients before implementation compared to 226 (93.0%) of 243 patients after implementation. An immediate 15.5% improvement (P = .019) in optimal antimicrobial prescribing was observed following the implementation of the e-app.
Conclusions:
Empiric antimicrobial prescribing for pediatric patients with infectious syndromes improved after implementation of an e-app for dissemination of clinical practice guidelines. The use of e-apps may also be an effective strategy to improve antimicrobial use in other patient populations.
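The interrupted time-series analysis mentioned in the methods above can be illustrated with a simple segmented regression. This is an editorial sketch under stated assumptions: the abstract does not specify the exact model, and the monthly aggregation, column names, and file name are hypothetical.

```python
# Hedged sketch of an interrupted time-series (segmented regression) analysis
# of the monthly proportion of optimal empiric prescribing, with an immediate
# level change at e-app implementation.
import pandas as pd
import statsmodels.formula.api as smf

ts = pd.read_csv("monthly_prescribing.csv")  # hypothetical file
# columns: month_index (0,1,2,...), optimal_pct (0-100), post (0 before, 1 after e-app)

first_post_month = ts.loc[ts["post"] == 1, "month_index"].min()
ts["time_since_intervention"] = (ts["month_index"] - first_post_month).clip(lower=0)

its = smf.ols(
    "optimal_pct ~ month_index + post + time_since_intervention", data=ts
).fit()

# 'post' estimates the immediate level change (cf. the ~15.5% jump reported above);
# 'time_since_intervention' estimates any change in trend after implementation.
print(its.summary())
```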
Quantifying the marine radiocarbon reservoir effect, offsets (ΔR), and ΔR variability over time is critical to improving dating estimates of marine samples while also providing a proxy of water mass dynamics. In the northeastern Pacific, where no high-resolution time series of ΔR has yet been established, we sampled radiocarbon (14C) from exactly dated growth increments in a multicentennial chronology of the long-lived bivalve, Pacific geoduck (Panopea generosa), at the Tree Nob site, coastal British Columbia, Canada. Samples were taken at approximately decadal intervals from 1725 CE to 1920 CE and indicate average ΔR values of 256 ± 22 years (1σ), consistent with existing discrete estimates. Temporal variability in ΔR is small relative to analogous Atlantic records, except for an unusually old-water event in 1802–1812. The correlation between ΔR and sea surface temperature (SST) reconstructed from geoduck increment width is weakly significant (r2 = .29, p = .03) when the 1802–1812 interval is excluded, indicating that warm water is generally old. This interval contains the oldest (–2.1σ) ΔR anomaly, which is coincident with the coldest (–2.7σ) anomaly of the temperature reconstruction. An additional 32 14C values spanning 1952–1980 were detrended using a northeastern Pacific bomb pulse curve. Significant positive correlations were identified between the detrended 14C data and annual El Niño Southern Oscillation (ENSO) and summer SST, such that cooler conditions are associated with older water. Thus, 14C is generally relatively stable, with weak and potentially inconsistent associations with climate variables, but it is capable of infrequent excursions, as illustrated by the unusually cold, old-water 1802–1812 interval.
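For readers unfamiliar with reservoir offsets, the sketch below (an editorial illustration, not the authors' workflow) shows the basic ΔR arithmetic: the measured conventional 14C age of an exactly dated shell increment minus the marine calibration-curve model age at the same calendar year, followed by a simple correlation with a temperature reconstruction. The curve file (e.g. Marine20), column names, and data layout are assumptions.

```python
# Hedged sketch of Delta-R computation for exactly dated shell increments and
# its correlation with a reconstructed SST series.
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

samples = pd.read_csv("geoduck_14c.csv")      # hypothetical: cal_year_ce, c14_age_bp, sst_recon
marine = pd.read_csv("marine20.csv")          # hypothetical calibration curve: cal_bp, c14_age_bp
marine = marine.sort_values("cal_bp")

# Convert calendar year CE to cal BP (years before 1950) and interpolate the curve.
samples["cal_bp"] = 1950 - samples["cal_year_ce"]
curve_age = np.interp(samples["cal_bp"], marine["cal_bp"], marine["c14_age_bp"])

samples["delta_r"] = samples["c14_age_bp"] - curve_age
print("Mean Delta-R:", round(samples["delta_r"].mean()), "14C years")

# Association between Delta-R and the increment-width SST reconstruction.
r, p = pearsonr(samples["delta_r"], samples["sst_recon"])
print(f"r^2 = {r**2:.2f}, p = {p:.3f}")
```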
The legal brief is a primary vehicle by which lawyers seek to persuade appellate judges. Despite wide acceptance that briefs are important, empirical scholarship has yet to establish their influence on the Supreme Court or fully explore justices’ preferences regarding them. We argue that emotional language conveys a lack of credibility to justices and thereby diminishes the party’s likelihood of garnering justices’ votes. The data concur. Using an automated textual analysis program, we find that parties who employ less emotional language in their briefs are more likely to win a justice’s vote, a result that holds even after controlling for other features correlated with success, such as case quality. These findings suggest that advocates seeking to influence judges can enhance their credibility and attract justices’ votes by employing measured, objective language.
Background: Eye movements reveal neurodegenerative disease processes due to overlap between oculomotor circuitry and disease-affected areas. Characterizing oculomotor behaviour in the context of cognitive function may enhance disease diagnosis and monitoring. We therefore aimed to quantify cognitive impairment in neurodegenerative disease using saccade behaviour and neuropsychology. Methods: The Ontario Neurodegenerative Disease Research Initiative recruited individuals with neurodegenerative disease: Alzheimer’s disease, mild cognitive impairment, amyotrophic lateral sclerosis, frontotemporal dementia, Parkinson’s disease, or cerebrovascular disease. Patients (n=450, age 40-87) and healthy controls (n=149, age 42-87) completed a randomly interleaved pro- and anti-saccade task (IPAST) while their eyes were tracked. We explored the relationships of saccade parameters (e.g. task errors, reaction times) to one another and to cognitive domain-specific neuropsychological test scores (e.g. executive function, memory). Results: Task performance worsened with cognitive impairment across multiple diseases. Subsets of saccade parameters were interrelated and also differentially related to neuropsychology-based cognitive domain scores (e.g. antisaccade errors and reaction times were associated with executive function). Conclusions: IPAST detects global cognitive impairment across neurodegenerative diseases. Subsets of parameters associate with one another, suggesting disparate underlying circuitry, and with different cognitive domains. This may have implications for the use of IPAST as a cognitive screening tool in neurodegenerative disease.
Background: Candida auris is an emerging multidrug-resistant yeast that is transmitted in healthcare facilities and is associated with substantial morbidity and mortality. Environmental contamination is suspected to play an important role in transmission, but additional information is needed to inform environmental cleaning recommendations to prevent spread. Methods: We conducted a multiregional (Chicago, IL; Irvine, CA) prospective study of environmental contamination associated with C. auris colonization of patients and residents of 4 long-term care facilities and 1 acute-care hospital. Participants were identified by screening or clinical cultures. Samples were collected from participants’ body sites (eg, nares, axillae, inguinal creases, palms and fingertips, and perianal skin) and their environment before room cleaning. Daily room cleaning and disinfection by facility environmental service workers was followed by targeted cleaning of high-touch surfaces by research staff using hydrogen peroxide wipes (an EPA-registered product for C. auris, List P). Samples were collected from high-touch surfaces immediately after cleaning and at 4-hour intervals up to 12 hours. A pilot phase (n = 12 patients) was conducted to identify the value of testing specific high-touch surfaces to assess environmental contamination. High-yield surfaces were included in the full evaluation phase (n = 20 patients) (Fig. 1). Samples were submitted for semiquantitative culture of C. auris and other multidrug-resistant organisms (MDROs), including methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant Enterococcus (VRE), extended-spectrum β-lactamase–producing Enterobacterales (ESBLs), and carbapenem-resistant Enterobacterales (CRE). Times to room surface contamination with C. auris and other MDROs after effective cleaning were analyzed. Results: Candida auris colonization was most frequently detected in the nares (72%) and on the palms and fingertips (72%). Cocolonization of body sites with other MDROs was common (Fig. 2). Surfaces located close to the patient were commonly recontaminated with C. auris by 4 hours after cleaning, including the overbed table (24%), bed handrail (24%), and TV remote or call button (19%). Environmental cocontamination was more common with resistant gram-positive organisms (MRSA and VRE) than with resistant gram-negative organisms (Fig. 3). C. auris was rarely detected on surfaces located outside a patient’s room (1 of 120 swabs; <1%). Conclusions: Environmental surfaces near C. auris–colonized patients were rapidly recontaminated after cleaning and disinfection. Cocolonization of skin and environment with other MDROs was common, with resistant gram-positive organisms predominating over gram-negative organisms on environmental surfaces. Limitations include the lack of organism sequencing or typing to confirm that environmental contamination originated from the room resident. Rapid recontamination of environmental surfaces after manual cleaning and disinfection suggests that alternative mitigation strategies should be evaluated.
Contemporary conservation professionals are part of a workforce focused on overcoming complex challenges under great time pressure. The characteristics of conservation work, and in particular the evolving demands placed on the workforce, mean that to remain effective these professionals need to enhance their skills and abilities continually. Currently, there are no sector-wide guidelines to promote systematic professional development that addresses both individual and organizational learning. This study builds upon existing knowledge from other sectors by examining professional development in conservation through an in-depth qualitative thematic analysis of interviews with 22 conservation professionals, resulting in an effectiveness framework for professional development in the conservation sector. Our findings indicate that individuals’ motivation to learn, proactivity and open-mindedness towards alternative information and views were considered preconditions for effective professional development. A balance between organizational goals and career ambitions was found to be essential for maintaining this motivation to learn and vital for staff retention and preservation of institutional knowledge. Professional development plans may help distinguish between individual career aspirations and organizational objectives and aid a discussion between staff and management on how to balance the two. Leaders have the opportunity to remove barriers to effective professional development. We discuss solutions to overcome specific barriers and to promote an inclusive approach for diverse learners through provision of opportunities, effective learning design and resource distribution for professional development. This effectiveness framework can be used by conservationists and conservation organizations to plan and decide on professional development.
The objective was to examine risk and protective factors associated with pre- to early-pandemic changes in risk of household food insecurity (FI).
Design:
We re-enrolled families from two statewide studies (2017–2020) in an observational cohort (May–August 2020). Caregivers reported on risk of household FI, demographics, pandemic-related hardships, and participation in safety net programmes (e.g. Coronavirus Aid, Relief, and Economic Security (CARES) stimulus payment, school meals).
Setting:
Maryland, USA.
Participants:
Economically, geographically and racially/ethnically diverse families with preschool to adolescent-age children. Eligibility included reported receipt or expected receipt of the CARES stimulus payment or a pandemic-related economic hardship (n 496).
Results:
Prevalence of risk of FI was unchanged (pre-pandemic: 22 %, early-pandemic: 25 %, p = 0·27). Risk of early-pandemic FI was elevated for non-Hispanic Black families (adjusted relative risk (aRR) = 2·1 (95 % CI 1·1, 4·0)), families of other races/ethnicities (aRR = 2·6 (1·3, 5·4)) and families earning ≤ 300 % of the federal poverty level. Among pre-pandemic food-secure families, decreased income, job loss and reduced work hours were associated with increased early-pandemic FI risk (aRR = 2·1 (1·2, 3·6) to 2·5 (1·5, 4·1)); the CARES stimulus payment (aRR = 0·5 (0·3, 0·9)) and continued school meal participation (aRR = 0·2 (0·1, 0·9)) were associated with decreased risk. Among families at risk of FI pre-pandemic, safety net programme participation was not associated with early-pandemic FI risk.
Conclusions:
The CARES stimulus payment and continued school meal participation protected pre-pandemic food secure families from early-pandemic FI risk but did not protect families who were at risk of FI pre-pandemic. Mitigating pre-pandemic FI risk and providing stimulus payments and school meals may support children’s health and reduce disparities in response to pandemics.
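The adjusted relative risks reported above can be estimated in several ways; the abstract does not state the estimator. As a hedged editorial sketch, a common choice for binary outcomes is a modified Poisson model (Poisson regression with robust standard errors), shown below with hypothetical variable names.

```python
# Hedged sketch of estimating an adjusted relative risk (aRR) for early-pandemic
# food-insecurity risk via a modified Poisson model; this is an assumption, not
# the study's documented method.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("covid_cohort.csv")  # hypothetical analysis file
# columns: fi_early (0/1), job_loss (0/1), cares_payment (0/1), school_meals (0/1),
#          race_ethnicity, income_pct_fpl, ...

model = smf.poisson(
    "fi_early ~ job_loss + cares_payment + school_meals + C(race_ethnicity) + income_pct_fpl",
    data=df,
).fit(cov_type="HC0")   # robust variance for valid relative-risk inference

arr = np.exp(model.params["cares_payment"])
ci = np.exp(model.conf_int().loc["cares_payment"])
print(f"aRR for CARES payment: {arr:.2f} ({ci[0]:.2f}, {ci[1]:.2f})")
```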
Recent excavations by the Ancient Southwest Texas Project of Texas State University sampled a previously undocumented Younger Dryas component from Eagle Cave in the Lower Pecos Canyonlands of Texas. This stratified assemblage consists of bison (Bison antiquus) bones in association with lithic artifacts and a hearth. Bayesian modeling yields an age of 12,660–12,480 cal BP, and analyses indicate behaviors associated with the processing of a juvenile bison and the manufacture and maintenance of lithic tools. This article presents spatial, faunal, macrobotanical, chronometric, geoarchaeological, and lithic analyses relating to the Younger Dryas component within Eagle Cave. The identification of the Younger Dryas occupation in Eagle Cave should encourage archaeologists to revisit previously excavated rockshelter sites in the Lower Pecos and beyond to evaluate deposits for unrecognized, older occupations.
To review patient satisfaction with the change in practice towards telephone consultations for head and neck cancer follow-up during and after the coronavirus disease 2019 pandemic.
Method
A retrospective analysis was conducted of head and neck cancer telephone appointments during a six-month period in a tertiary referral centre.
Results
Patients found the telephone consultations beneficial (98 per cent), with 30 per cent stating they were relieved not to have to attend hospital. Patients who travelled further, those with lower stage disease and those with a greater interval since initial treatment were most satisfied with the telephone consultations. Sixty-eight per cent of patients stated they would be happy to have telephone consultations as part of their regular follow-up after the pandemic.
Conclusion
Patients found the telephone consultations beneficial, and 30 per cent considered them preferable to face-to-face appointments. This study demonstrates that telephone consultations can be used as an adjunct to face-to-face appointments in an effort to reduce hospital attendances whilst maintaining close follow-up.