Aviation passenger screening has been used worldwide to mitigate the translocation risk of SARS-CoV-2. We present a model that evaluates factors in screening strategies used in air travel and assesses their relative sensitivity and importance in identifying infectious passengers. We use adapted Monte Carlo simulations to produce hypothetical disease timelines for the Omicron variant of SARS-CoV-2 for travelling passengers. Screening strategy factors assessed include having one or two RT-PCR and/or antigen tests prior to departure and/or post-arrival, and quarantine length and compliance upon arrival. One or more post-arrival tests and high quarantine compliance were the most important factors in reducing pathogen translocation. Screening that combines quarantine and post-arrival testing can shorten the length of quarantine for travellers, and the variability and mean sensitivity of post-arrival RT-PCR and antigen testing respectively decrease and increase as the time between the first and second post-arrival tests grows. This study provides insight into the roles various screening strategy factors play in preventing the translocation of infectious diseases and offers a flexible framework adaptable to other existing or emerging diseases. Such findings may inform public health policy and decision-making in present and future evidence-based practices for passenger screening and pandemic preparedness.
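As an illustration of the approach described above, the following is a minimal Monte Carlo sketch, not the authors' model: it assumes a simple time-varying test sensitivity curve and estimates how many infectious travellers escape a pre-departure antigen test combined with a single post-arrival RT-PCR test under imperfect compliance. All parameter values, function names, and distributions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def detection_prob(days_since_infection, test="pcr"):
    """Assumed time-varying test sensitivity (illustrative only):
    peaks a few days after infection, lower for antigen tests."""
    peak = {"pcr": 0.95, "antigen": 0.75}[test]
    return peak * np.exp(-0.5 * ((days_since_infection - 4.0) / 2.5) ** 2)

def simulate(n=100_000, post_arrival_day=2, quarantine_compliance=0.8):
    """Fraction of infectious travellers released undetected, under a
    pre-departure antigen test plus one post-arrival RT-PCR test."""
    # Days since infection at departure (uniform over an assumed 10-day window).
    t_depart = rng.uniform(0, 10, n)
    caught_pre = rng.random(n) < detection_prob(t_depart, "antigen")
    t_post = t_depart + 1 + post_arrival_day  # assume a 1-day journey
    caught_post = rng.random(n) < detection_prob(t_post, "pcr")
    # Non-compliant travellers skip the post-arrival test.
    complies = rng.random(n) < quarantine_compliance
    detected = caught_pre | (caught_post & complies)
    return 1.0 - detected.mean()

if __name__ == "__main__":
    for day in (1, 3, 5):
        print(f"post-arrival test on day {day}: "
              f"{simulate(post_arrival_day=day):.2%} escape screening")
```

Varying the post-arrival test day and compliance in a sketch like this reproduces the qualitative pattern described in the abstract, namely that post-arrival testing and high compliance dominate the reduction in translocation risk.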
Precision Medicine is an emerging approach for disease treatment and prevention that takes into account individual variability in genes, environment, and lifestyle. Autoimmune diseases are those in which the body’s natural defense system loses its ability to discriminate between its own cells and foreign cells, causing the body to mistakenly attack healthy tissues. These conditions are very heterogeneous in their presentation and therefore difficult to diagnose and treat. Achieving precision medicine in autoimmune diseases has been challenging due to the complex etiologies of these conditions, involving an interplay between genetic, epigenetic, and environmental factors. However, recent technological and computational advances in molecular profiling have helped identify patient subtypes and molecular pathways which can be used to improve diagnostics and therapeutics. This review discusses the current understanding of disease mechanisms, heterogeneity, and pathogenic autoantigens in autoimmune diseases gained from genomic and transcriptomic studies, and highlights how these findings can be applied to better understand disease heterogeneity in the context of diagnostics and therapeutics.
Serial position scores on verbal memory tests are sensitive to early Alzheimer’s disease (AD)-related neuropathological changes that occur in the entorhinal cortex and hippocampus. The current study examines longitudinal change in serial position scores as markers of subtle cognitive decline in older adults who may be in preclinical or at-risk states for AD.
Methods:
This study uses longitudinal data from the Religious Orders Study and the Rush Memory and Aging Project. Participants (n = 141) were included if they did not have dementia at enrollment, completed follow-up assessments, and died and were classified as Braak stage I or II. Memory tests were used to calculate serial position (primacy, recency), total recall, and episodic memory composite scores. A neuropathological evaluation quantified AD, vascular, and Lewy body pathologies. Mixed effects models were used to examine change in memory scores. Neuropathologies and covariates (age, sex, education, APOE e4) were examined as moderators.
Results:
Primacy scores declined (β = −.032, p < .001), whereas recency scores increased (β = .021, p = .012). No change was observed in standard memory measures. Greater neurofibrillary tangle density and atherosclerosis explained 10.4% of the variance in primacy decline. Neuropathologies were not associated with recency change.
Conclusions:
In older adults with hippocampal neuropathologies, primacy score decline may be a sensitive marker of early AD-related changes. Tangle density and atherosclerosis had additive effects on decline. Recency improvement may reflect a compensatory mechanism. Monitoring for changes in serial position scores may be a useful in vivo method of tracking incipient AD.
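To make the mixed effects analysis described in the Methods above concrete, here is a minimal sketch of that kind of linear mixed effects model written with Python's statsmodels. The data file, column names, and moderator coding are assumptions for illustration, not the study's actual variables or analysis.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per participant-visit.
# Assumed columns: id, years_from_baseline, primacy, tangles, athero,
# age_baseline, sex, education, apoe_e4.
df = pd.read_csv("longitudinal_memory.csv")  # placeholder file name

# Random intercept and slope per participant; time-by-pathology interactions
# test whether neuropathologies moderate the rate of primacy decline.
model = smf.mixedlm(
    "primacy ~ years_from_baseline * (tangles + athero) "
    "+ age_baseline + sex + education + apoe_e4",
    data=df,
    groups=df["id"],
    re_formula="~years_from_baseline",
)
result = model.fit(reml=True)
print(result.summary())
```

In a specification like this, the coefficient on years_from_baseline captures average change, and the interaction terms correspond to the moderation by tangle density and atherosclerosis reported in the Results.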
To describe the genomic analysis and epidemiologic response related to a slow and prolonged methicillin-resistant Staphylococcus aureus (MRSA) outbreak.
Design:
Prospective observational study.
Setting:
Neonatal intensive care unit (NICU).
Methods:
We conducted an epidemiologic investigation of a NICU MRSA outbreak involving serial baby and staff screening to identify opportunities for decolonization. Whole-genome sequencing was performed on MRSA isolates.
Results:
A NICU with excellent hand hygiene compliance and longstanding minimal healthcare-associated infections experienced an MRSA outbreak involving 15 babies and 6 healthcare personnel (HCP). In total, 12 cases occurred slowly over a 1-year period (mean, 30.7 days apart) followed by 3 additional cases 7 months later. Multiple progressive infection prevention interventions were implemented, including contact precautions and cohorting of MRSA-positive babies, hand hygiene observers, enhanced environmental cleaning, screening of babies and staff, and decolonization of carriers. Only decolonization of HCP found to be persistent carriers of MRSA was successful in stopping transmission and ending the outbreak. Genomic analyses identified bidirectional transmission between babies and HCP during the outbreak.
Conclusions:
In comparison to fast outbreaks, “slow and sustained” outbreaks may be more common in units with strong existing infection prevention practices, where a series of breaches must align to result in a case. We identified a slow outbreak that persisted among staff and babies and was stopped only by identifying and decolonizing persistent MRSA carriage among staff. A repeated decolonization regimen was successful in allowing previously persistent carriers to safely continue work duties.
Early in the COVID-19 pandemic, the World Health Organization stressed the importance of daily clinical assessments of infected patients, yet current approaches frequently consider cross-sectional timepoints, cumulative summary measures, or time-to-event analyses. Statistical methods are available that make use of the rich information content of longitudinal assessments. We demonstrate the use of a multistate transition model to assess the dynamic nature of COVID-19-associated critical illness using daily evaluations of COVID-19 patients from 9 academic hospitals. We describe the accessibility and utility of methods that consider the clinical trajectory of critically ill COVID-19 patients.
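As a toy illustration of the multistate approach described above (not the authors' implementation), the sketch below builds an empirical one-day transition probability matrix between assumed ordinal clinical states from daily assessments; the state coding and example data are invented for illustration.

```python
import numpy as np
import pandas as pd

# Hypothetical daily assessments: one row per patient-day with an ordinal
# clinical state (assumed coding: 0 = ward, 1 = ICU, 2 = ventilated, 3 = died/discharged).
df = pd.DataFrame({
    "patient": [1, 1, 1, 2, 2, 2, 2],
    "day":     [0, 1, 2, 0, 1, 2, 3],
    "state":   [0, 1, 1, 0, 0, 1, 2],
})

def transition_matrix(df, n_states=4):
    """Empirical one-day transition probabilities between clinical states."""
    counts = np.zeros((n_states, n_states))
    for _, grp in df.sort_values("day").groupby("patient"):
        states = grp["state"].to_numpy()
        for a, b in zip(states[:-1], states[1:]):
            counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

print(transition_matrix(df))
```

A fitted multistate model extends this idea by modeling the transition intensities as functions of patient covariates rather than tabulating raw frequencies.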
Introduced mammalian predators are responsible for the decline and extinction of many native species, with rats (genus Rattus) being among the most widespread and damaging invaders worldwide. In a naturally fragmented landscape, we demonstrate the multi-year effectiveness of snap traps in the removal of Rattus rattus and Rattus exulans from lava-surrounded forest fragments ranging in size from <0.1 to >10 ha. Relative to other studies, we observed low levels of fragment recolonization. Larger rats were the first to be trapped, with the average size of trapped rats decreasing over time. Rat removal led to distinct shifts in the foraging height and location of mongooses and mice, emphasizing the need to focus control efforts on multiple invasive species at once. Furthermore, because of a specially designed trap casing, we observed low non-target capture rates, suggesting that on Hawai‘i and similar islands lacking native rodents, the risk of killing non-target species with snap traps may be lower than with the application of rodenticides, which have the potential to contaminate food webs. These efforts demonstrate that targeted snap-trapping is an effective removal method for invasive rats in fragmented habitats and that, where used, monitoring of recolonization should be included as part of a comprehensive biodiversity management strategy.
To assess the impact of a newly developed Central-Line Insertion Site Assessment (CLISA) score on the incidence of local inflammation or infection, with the goal of preventing central-line–associated bloodstream infections (CLABSI).
Design:
A pre- and postintervention, quasi-experimental quality improvement study.
Setting and participants:
Adult inpatients with central venous catheters (CVCs) hospitalized in an intensive care unit or oncology ward at a large academic medical center.
Methods:
We evaluated the impact of the CLISA score on the incidence of insertion-site inflammation or infection (CLISA score of 2 or 3) in the baseline period (June 2014–January 2015) and the intervention period (April 2015–October 2017) using interrupted time series and generalized linear mixed-effects multivariable analyses. These analyses were run separately for days to line removal after identification of a CLISA score of 2 or 3. CLISA score interrater reliability and photo quiz results were also evaluated.
Results:
Among 6,957 CVCs assessed 40,846 times, the percentage of lines with a CLISA score of 2 or 3 decreased by 78.2% from the baseline to the intervention period (from 22.0% to 4.7%), with a significant immediate decrease in the time-series analysis (P < .001). According to the multivariable regression, the intervention was associated with a lower percentage of lines with a CLISA score of 2 or 3, after adjusting for age, gender, CVC body location, and hospital unit (odds ratio, 0.15; 95% confidence interval, 0.06–0.34; P < .001). According to the multivariable regression for days to line removal, removal of lines with a CLISA score of 2 or 3 was 3.19 days faster after the intervention (P < .001). Also, line dwell time decreased 37.1%, from a mean of 14 days (standard deviation [SD], 10.6) to 8.8 days (SD, 9.0) (P < .001). Device utilization ratios decreased 9%, from 0.64 (SD, 0.08) to 0.58 (SD, 0.06) (P = .039).
Conclusions:
The CLISA score creates a common language for assessing line infection risk and successfully promotes high compliance with best practices in timely line removal.
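For a concrete picture of the adjusted line-level analysis in this abstract, the sketch below is a rough stand-in: it fits a logistic GEE with clustering by hospital unit in place of the study's generalized linear mixed-effects model, using hypothetical column names. It is illustrative only and not the study's code or data.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical per-assessment data; assumed columns:
# score_2_3 (0/1), intervention (0/1), age, gender, cvc_site, unit.
df = pd.read_csv("clisa_assessments.csv")  # placeholder file name

# Stand-in for the study's mixed-effects model: a logistic GEE with an
# exchangeable correlation structure accounts for clustering within units.
model = smf.gee(
    "score_2_3 ~ intervention + age + gender + cvc_site",
    groups="unit",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(result.summary())  # exp(coef) on 'intervention' approximates the adjusted odds ratio
```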
Though theory suggests that individual differences in neuroticism (a tendency to experience negative emotions) would be associated with altered functioning of the amygdala (which has been linked with emotionality and emotion dysregulation in childhood, adolescence, and adulthood), results of functional neuroimaging studies have been contradictory and inconclusive. We aimed to clarify the relationship between neuroticism and three hypothesized neural markers derived from functional magnetic resonance imaging during negative emotion face processing: amygdala activation, amygdala habituation, and amygdala-prefrontal connectivity, each of which plays an important role in the experience and regulation of emotions. We used general linear models to examine the relationship between trait neuroticism and the hypothesized neural markers in a large sample of over 500 young adults. Although neuroticism was not significantly associated with magnitude of amygdala activation or amygdala habituation, it was associated with amygdala–ventromedial prefrontal cortex connectivity, which has been implicated in emotion regulation. Results suggest that trait neuroticism may represent a failure in top-down control and regulation of emotional reactions, rather than overactive emotion generation processes, per se. These findings suggest that neuroticism, which has been associated with increased rates of transdiagnostic psychopathology, may represent a failure in the inhibitory neurocircuitry associated with emotion regulation.
OBJECTIVES/SPECIFIC AIMS: Rodent models can be used to study neonatal abstinence syndrome (NAS), but the applicability of findings from the models to NAS in humans is not well understood. The objective of this study was to develop a rat model of norbuprenorphine-induced NAS and validate its translational value by comparing blood concentrations in the norbuprenorphine-treated pregnant rat to those previously reported in pregnant women undergoing buprenorphine treatment. METHODS/STUDY POPULATION: Pregnant Long-Evans rats were implanted with 14-day osmotic minipumps containing vehicle, morphine (positive control), or norbuprenorphine (0.3–3 mg/kg/d) on gestation day 9. Within 12 hours of delivery, pups were tested for spontaneous or precipitated opioid withdrawal by injecting them with saline (10 mL/kg, i.p.) or naltrexone (1 or 10 mg/kg, i.p.), respectively, and observing them for well-validated neonatal withdrawal signs. Blood was sampled via indwelling jugular catheters from a subset of norbuprenorphine-treated dams on gestation days 8, 10, 13, 17, and 20. Norbuprenorphine concentrations in whole blood samples were quantified using LC/MS/MS. RESULTS/ANTICIPATED RESULTS: Blood concentrations of norbuprenorphine in rats exposed to 1–3 mg/kg/d of norbuprenorphine were similar to levels previously reported in pregnant women undergoing buprenorphine treatment. Pups born to dams treated with these doses exhibited robust withdrawal signs. Blood concentrations of norbuprenorphine decreased across gestation, which is similar to previous reports in humans. DISCUSSION/SIGNIFICANCE OF IMPACT: These results suggest that dosing dams with 1–3 mg/kg/d of norbuprenorphine produces maternal blood concentrations and withdrawal severity similar to those previously reported in humans. This provides evidence that, at these doses, this model is useful for testing hypotheses about norbuprenorphine that are applicable to NAS in humans.
To examine how the introduction of intensive community support (ICS) affected admissions to community hospital (CH) and to explore the views of patients, carers and health professionals on this transition.
Background
ICS was introduced to provide an alternative to CH provision for patients (mostly very elderly) requiring general rehabilitation.
Method
Routine data from both services were analysed to identify the number of admissions and length of stay between September 2012 and September 2014. In total, 10 patients took part in qualitative interviews. Qualitative interviews and focus groups were undertaken with 19 staff members, including managers and clinicians.
Findings
There were 5653 admissions to CH and 1710 to ICS between September 2012 and September 2014. In the five months before the introduction of ICS, admission rates to CH were on average 217/month; in the final five months of the study, when both services were fully operational, average numbers of patients admitted were: CH 162 (a 25% reduction), ICS 97, total 259 (a 19% increase). Patients and carers rated both ICS and CH favourably compared with acute hospital care. Those who had experienced both services felt each to be appropriate at the time; they appreciated the 24 h availability of staff in CH when they were more dependent, and the convenience of being at home after they had improved. In general, staff welcomed the introduction of ICS and appreciated the advantages of home-based rehabilitation. Managers had a clearer vision of ICS than staff on the ground, some of whom felt underprepared to work in the community. There was a consensus that ICS was managing less complex and dependent patients than had been envisaged.
Conclusion
ICS can provide a feasible adjunct to CH that is acceptable to patients. More work is needed to promote the vision of ICS amongst staff in both community and acute sectors.
The Chemical Movement through Layered Soils (CMLS) model was modified and combined with the USDA-SCS State Soil Geographic Data Base (STATSGO) and Montana Agricultural Potentials System (MAPS) digital databases to assess the likelihood of groundwater contamination from selected herbicides in Teton County, MT. The STATSGO and MAPS databases were overlaid to produce polygons with unique soil and climate characteristics and attribute tables containing only those data needed by the CMLS model. The Weather Generator (WGEN) computer simulation model was modified and used to generate daily precipitation and evapotranspiration values. A new algorithm was developed to estimate soil carbon as a function of soil depth. The depth of movement of the applied chemicals at the end of the growing season was estimated with CMLS for each of the soil series in the STATSGO soil mapping units and the results were entered into ARC/INFO to produce the final hazard maps showing best, weighted average, and worst case results for every unique combination (polygon) of soil mapping unit and climate. County weed infestation maps for leafy spurge and spotted knapweed were digitized and overlaid in ARC/INFO with the CMLS model results for picloram to illustrate how the results might be used to evaluate the threat to groundwater posed by current herbicide applications.
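The abstract above does not give the form of the new soil-carbon algorithm or the CMLS update rule, so the sketch below is purely illustrative: it assumes an exponential decline of organic carbon with depth and uses the standard linear-sorption retardation factor to advance a chemical front after a drainage event. All parameter values (bulk density, field capacity, Koc) and function names are assumptions.

```python
import math

def organic_carbon(depth_cm, oc_surface=0.02, k=0.03):
    """Assumed exponential decline of the soil organic carbon fraction with depth
    (an illustrative stand-in for the unspecified algorithm in the abstract)."""
    return oc_surface * math.exp(-k * depth_cm)

def retardation_factor(koc, depth_cm, bulk_density=1.4, theta_fc=0.30):
    """Standard linear-sorption retardation factor: R = 1 + rho_b * Kd / theta."""
    kd = koc * organic_carbon(depth_cm)  # sorption coefficient (L/kg)
    return 1.0 + bulk_density * kd / theta_fc

def depth_increment(percolation_cm, depth_cm, koc, theta_fc=0.30):
    """CMLS-style advance of the chemical front for one drainage event:
    front moves by percolation / (theta_fc * R) (sketch with assumed parameters)."""
    return percolation_cm / (theta_fc * retardation_factor(koc, depth_cm))

# Example: a weakly sorbed herbicide (assumed Koc of 16 L/kg) after 5 cm of
# drainage, with the front currently at 30 cm depth.
print(f"front advance: {depth_increment(5.0, 30.0, koc=16):.1f} cm")
```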
Growing enough cover crop biomass to adequately suppress weeds is one of the primary challenges in reduced-tillage systems that rely on mulch-based weed suppression. We investigated two approaches to increasing cereal rye biomass for improved weed suppression: (1) increasing soil fertility and (2) increasing cereal rye seeding rate. We conducted a factorial experiment with three poultry litter application rates (0, 80, and 160 kg N ha−1) and three rye seeding rates (90, 150, and 210 kg seed ha−1) in Pennsylvania and Maryland in 2008 and 2009. We quantified rye biomass immediately after mechanically terminating it with a roller, and weed biomass at 10 wk after termination (WAT). Rye biomass increased with poultry litter applications (675, 768, and 787 g m−2 in the 0, 80, and 160 kg N ha−1 treatments, respectively), but this increased rye biomass did not decrease weed biomass. In contrast, increasing rye seeding rate did not increase rye biomass, but it did reduce weed biomass (328, 279, and 225 g m−2 in the 90, 150, and 210 kg seed ha−1 treatments, respectively). In 2009, we also sampled ground cover before rolling and weed biomass and density at 4 WAT. Although there were no treatment effects, we found a correlation between the percentage of bare soil before rolling and weed biomass at 4 WAT. Our results suggest that increased rye seeding rate can effectively reduce weed biomass and that ground cover in early spring can influence weed biomass later in the growing season.
During 1990 we surveyed the southern sky using a multi-beam receiver at frequencies of 4850 and 843 MHz. The half-power beamwidths were 4 and 25 arcmin, respectively. The finished surveys cover declinations from +10 to −90 degrees, essentially complete in right ascension, an area of 7.30 steradians. Preliminary analysis of the 4850 MHz data indicates that we will achieve a five-sigma flux density limit of about 30 mJy. We estimate that we will find between 80 000 and 90 000 new sources above this limit. This is a revised version of the paper presented at the Regional Meeting by the first four authors; the surveys have now been completed.
The 2013 multistate outbreaks contributed to the largest annual number of reported US cases of cyclosporiasis since 1997. In this paper we focus on investigations in Texas. We defined an outbreak-associated case as laboratory-confirmed cyclosporiasis in a person with illness onset between 1 June and 31 August 2013, with no history of international travel in the previous 14 days. Epidemiological, environmental, and traceback investigations were conducted. Of the 631 cases reported in the multistate outbreaks, Texas reported the greatest number of cases, 270 (43%). More than 70 clusters were identified in Texas, four of which were further investigated. One restaurant-associated cluster of 25 case-patients was selected for a case-control study. Consumption of cilantro was most strongly associated with illness on meal date-matched analysis (matched odds ratio 19·8, 95% confidence interval 4·0–∞). All case-patients in the other three clusters investigated also ate cilantro. Traceback investigations converged on three suppliers in Puebla, Mexico. Cilantro was the vehicle of infection in the four clusters investigated; the temporal association of these clusters with the large overall increase in cyclosporiasis cases in Texas suggests cilantro was the vehicle of infection for many other cases. However, the paucity of epidemiological and traceback information does not allow for a conclusive determination; moreover, molecular epidemiological tools for cyclosporiasis that could provide more definitive linkage between case clusters are needed.
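The matched odds ratio reported above comes from a meal date-matched case-control analysis. As a hedged illustration of the underlying idea (not the investigators' exact method, which may have used conditional logistic regression and exact intervals), the sketch below computes a discordant-pair matched odds ratio for 1:1 matched data; it returns infinity when no pair has an exposed control and an unexposed case. The example data are invented.

```python
def matched_odds_ratio(pairs):
    """Discordant-pair matched odds ratio for a 1:1 matched case-control study.

    `pairs` is a list of (case_exposed, control_exposed) booleans; only pairs
    discordant for exposure contribute to the estimate.
    """
    case_only = sum(1 for case, ctrl in pairs if case and not ctrl)
    control_only = sum(1 for case, ctrl in pairs if ctrl and not case)
    if control_only == 0:
        return float("inf")
    return case_only / control_only

# Hypothetical example: 10 pairs where only the case ate the item,
# 1 pair where only the control did, 5 concordant pairs.
example = [(True, False)] * 10 + [(False, True)] * 1 + [(True, True)] * 5
print(matched_odds_ratio(example))  # 10.0
```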
We conducted a time-series analysis to evaluate the impact of the antimicrobial stewardship program (ASP) over a 6.25-year period (July 1, 2008–September 30, 2014) while controlling for trends during a 3-year preintervention period (July 1, 2005–June 30, 2008). The primary outcome measures were total antibacterial and antipseudomonal use in days of therapy (DOT) per 1,000 patient-days (PD). Secondary outcomes included antimicrobial costs and resistance, hospital-onset Clostridium difficile infection, and other patient-centered measures.
RESULTS
During the preintervention period, total antibacterial and antipseudomonal use were declining (−9.2 and −5.5 DOT/1,000 PD per quarter, respectively). During the stewardship period, both continued to decline, although at lower rates (−3.7 and −2.2 DOT/1,000 PD, respectively), resulting in a slope change of 5.5 DOT/1,000 PD per quarter for total antibacterial use (P=.10) and 3.3 DOT/1,000 PD per quarter for antipseudomonal use (P=.01). Antibiotic expenditures declined markedly during the stewardship period (−$295.42/1,000 PD per quarter, P=.002). There were variable changes in antimicrobial resistance and few apparent changes in C. difficile infection and other patient-centered outcomes.
CONCLUSION
In a hospital with low baseline antibiotic use, implementation of an ASP was associated with sustained reductions in total antibacterial and antipseudomonal use and declining antibiotic expenditures. Common ASP outcome measures have limitations.
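As a sketch of the segmented time-series approach behind the slope-change estimates above (illustrative only, not the study's code or data), the example below simulates quarterly use with a preintervention slope of −9.2 and a stewardship-period slope of −3.7 DOT/1,000 PD per quarter, then recovers the +5.5 slope change with ordinary least squares. The series length, intercept, and noise level are assumptions.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical quarterly antibacterial use (DOT per 1,000 patient-days);
# quarters 0-11 are preintervention, 12-36 the stewardship period (assumed layout).
quarters = np.arange(37)
post = (quarters >= 12).astype(float)
quarters_post = np.where(post == 1, quarters - 12, 0.0)

# Simulated series: preintervention slope -9.2, stewardship slope -3.7,
# i.e. a slope change of -3.7 - (-9.2) = +5.5 DOT/1,000 PD per quarter.
rng = np.random.default_rng(0)
dot = 900 - 9.2 * quarters + 5.5 * quarters_post + rng.normal(0, 8, quarters.size)

# Segmented regression: intercept, baseline trend, level change, trend change.
X = sm.add_constant(np.column_stack([quarters, post, quarters_post]))
fit = sm.OLS(dot, X).fit()
print(fit.params)  # the last coefficient should recover the ~+5.5 slope change
```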