Among the Líĺwat people of the Interior Plateau of British Columbia, an oral tradition relates how early ancestors used to ascend Qẃelqẃelústen, or Mount Meager. The account maintains that those climbers could see the ocean, which is not possible today because the mountain is surrounded by many other high peaks and the Strait of Georgia lies several mountain ridges to the west. However, the mountain is an active and volatile volcano, which last erupted circa 2360 cal BP; it is also the site of the largest landslide in Canadian history, which occurred in 2010. Given that it had been a high, glacier-capped mountain throughout the Holocene, much like other volcanoes along the coastal range, we surmise that a climber could reasonably have been afforded a view of the ocean from its former heights. We conducted viewshed analyses of potential mountain heights prior to the eruption and determined that one could indeed view the ocean if the mountain were at least 950 m higher than it is today. This aligns with the oral tradition, indicating that it may be over 2,400 years old, and plausibly 4,000 to 9,000 years old, when the mountain may have stood at such a height.
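A hedged illustration of the core viewshed computation referenced above (a minimal sketch, not the authors' actual GIS workflow): given terrain elevations sampled along a transect toward the coast, the function below tests whether a sightline from a raised summit clears all intervening ridges after subtracting the drop due to Earth curvature. The profile arrays and the observer_extra parameter are hypothetical stand-ins for a DEM transect.

```python
import numpy as np

EARTH_R = 6_371_000.0  # mean Earth radius in metres

def line_of_sight(dist_m, elev_m, observer_extra=0.0):
    """Return True if the first point on a terrain profile can see the last.

    dist_m: along-profile distances in metres, starting at 0 (observer).
    elev_m: terrain elevations in metres at those distances.
    observer_extra: hypothetical extra height added at the observer,
        e.g. a pre-eruption summit standing higher than today's.
    """
    dist_m = np.asarray(dist_m, dtype=float)
    # Curvature drop d^2 / (2R), measured relative to the observer.
    eff = np.asarray(elev_m, dtype=float) - dist_m**2 / (2 * EARTH_R)
    obs = eff[0] + observer_extra
    tgt = eff[-1]
    # Height of the straight sightline at each intermediate sample.
    frac = dist_m[1:-1] / dist_m[-1]
    sight = obs + frac * (tgt - obs)
    return bool(np.all(eff[1:-1] <= sight))
```

Raising observer_extra in steps until the sightline first clears every ridge would recover a minimum-added-height estimate analogous to the 950 m figure reported above.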
Sperlingite, (H2O)K(Mn2+Fe3+)(Al2Ti)(PO4)4[O(OH)][(H2O)9(OH)]⋅4H2O, is a new monoclinic member of the paulkerrite group from the Hagendorf-Süd pegmatite, Oberpfalz, Bavaria, Germany. It was found in corrosion pits of altered zwieselite, in association with columbite, hopeite, leucophosphite, mitridatite, scholzite, orange–brown zincoberaunite sprays and tiny green crystals of zincolibethenite. Sperlingite forms colourless prisms with pyramidal terminations, predominantly only 5 to 20 μm in size and rarely up to 60 μm; the crystals are frequently multiply intergrown and overgrown with smaller crystals. The crystals are flattened on {010} and slightly elongated along [100] with forms {010}, {001} and {111}. Twinning occurs by rotation about c. The calculated density is 2.40 g⋅cm–3. Optically, sperlingite crystals are biaxial (+), α = 1.600(est), β = 1.615(5), γ = 1.635(5) (white light) and 2V (calc.) = 82.7°. The optical orientation is X = b, Y = c and Z = a. Neither dispersion nor pleochroism was observed. The empirical formula from electron microprobe analyses and structure refinement is A1[(H2O)0.96K0.04]Σ1.00A2(K0.52□0.48)Σ1.00M1(Mn2+0.60Mg0.33Zn0.29Fe3+0.77)Σ1.99M2+M3(Al1.05Ti4+1.33Fe3+0.62)Σ3.00(PO4)4X[F0.19(OH)0.94O0.87]Σ2.00[(H2O)9.23(OH)0.77]Σ10.00⋅3.96H2O. Sperlingite has monoclinic symmetry with space group P21/c and unit-cell parameters a = 10.428(2) Å, b = 20.281(4) Å, c = 12.223(2) Å, β = 90.10(3)°, V = 2585.0(8) Å3 and Z = 4. The crystal structure was refined using synchrotron single-crystal data to wRobs = 0.058 for 5608 reflections with I > 3σ(I). Sperlingite is the first paulkerrite-group mineral to have co-dominant divalent and trivalent cations at the M1 sites; all other reported members have Mn2+ or Mg dominant at M1. Local charge balance for Fe3+ at M1 is achieved by H2O → OH– at H2O coordinated to M1.
Objective:
To characterise transesophageal echocardiography practice patterns among paediatric cardiac surgical centres in the United States and Canada.
Methods:
A 42-question survey was sent to 80 echocardiography laboratory directors at paediatric cardiology centres with surgical programmes in the United States and Canada. Question domains included transesophageal echocardiography centre characteristics, performance and reporting, equipment use, trainee participation, and quality assurance.
Results:
Fifty of the 80 centres (62.5%) responded to the survey. Most settings were academic (86.0%), with 42.0% of centres performing >350 surgical cases/year. The median number of transesophageal echocardiograms performed/cardiologist/year was 50 (IQR: 26, 73). Pre-operative transesophageal echocardiography was performed in most surgical cases by 91.7% of centres. Transesophageal echocardiography was always performed by most centres following Norwood, Glenn, and Fontan procedures, and by <10% of centres following coarctation repair. Many centres with a written guideline allowed transesophageal echocardiography transducer use at weights below manufacturer recommendations (50.0% and 61.1% for neonatal and paediatric transducers, respectively). Most centres (36/37, 97.3%) with categorical fellowships had rotations that included transesophageal echocardiography participation. Large surgical centres (>350 cases/year) had a higher median number of transesophageal echocardiograms/cardiologist/year (75.5 [53, 86] versus 35 [20, 52], p < 0.001) and more frequently used anaesthesia for diagnostic transesophageal echocardiography ≥67% of the time (100.0% versus 62.1%, p = 0.001).
Conclusions:
There is significant variability in transesophageal echocardiography practice patterns and training requirements among paediatric cardiology centres in the United States and Canada. Findings may help inform programmatic decisions regarding transesophageal echocardiography expectations, performance and reporting, equipment use, trainee involvement, and quality assurance.
Precision medicine is an emerging approach to disease treatment and prevention that takes into account individual variability in genes, environment, and lifestyle. Autoimmune diseases are those in which the body's natural defense system loses the ability to discriminate between its own cells and foreign cells, causing the body to mistakenly attack healthy tissues. These conditions are highly heterogeneous in their presentation and therefore difficult to diagnose and treat. Achieving precision medicine in autoimmune diseases has been challenging due to the complex etiologies of these conditions, which involve an interplay between genetic, epigenetic, and environmental factors. However, recent technological and computational advances in molecular profiling have helped identify patient subtypes and molecular pathways that can be used to improve diagnostics and therapeutics. This review discusses the current understanding of disease mechanisms, heterogeneity, and pathogenic autoantigens in autoimmune diseases gained from genomic and transcriptomic studies, and highlights how these findings can be applied to better understand heterogeneity in the context of disease diagnostics and therapeutics.
Objective:
To describe the genomic analysis and epidemiologic response related to a slow and prolonged methicillin-resistant Staphylococcus aureus (MRSA) outbreak.
Design:
Prospective observational study.
Setting:
Neonatal intensive care unit (NICU).
Methods:
We conducted an epidemiologic investigation of a NICU MRSA outbreak involving serial baby and staff screening to identify opportunities for decolonization. Whole-genome sequencing was performed on MRSA isolates.
Results:
A NICU with excellent hand hygiene compliance and a long history of minimal healthcare-associated infections experienced an MRSA outbreak involving 15 babies and 6 healthcare personnel (HCP). In total, 12 cases occurred slowly over a 1-year period (mean, 30.7 days apart), followed by 3 additional cases 7 months later. Multiple progressive infection prevention interventions were implemented, including contact precautions and cohorting of MRSA-positive babies, hand hygiene observers, enhanced environmental cleaning, screening of babies and staff, and decolonization of carriers. Only decolonization of HCP found to be persistent MRSA carriers was successful in stopping transmission and ending the outbreak. Genomic analyses identified bidirectional transmission between babies and HCP during the outbreak.
Conclusions:
In comparison to fast outbreaks, outbreaks that are “slow and sustained” may be more common in units with strong existing infection prevention practices, such that a series of breaches must align to result in a case. We identified a slow outbreak that persisted among staff and babies and was stopped only by identifying and decolonizing persistent MRSA carriage among staff. A repeated decolonization regimen was successful in allowing previously persistent carriers to safely continue work duties.
One in six nursing home residents and staff with positive SARS-CoV-2 tests ≥90 days after initial infection had specimen cycle threshold (Ct) values <30. Individuals with specimen Ct <30 were more likely to report symptoms but did not differ from individuals with high-Ct specimens on other clinical and testing data.
Early in the COVID-19 pandemic, the World Health Organization stressed the importance of daily clinical assessments of infected patients, yet current approaches frequently consider cross-sectional timepoints, cumulative summary measures, or time-to-event analyses. Statistical methods are available that make use of the rich information content of longitudinal assessments. We demonstrate the use of a multistate transition model to assess the dynamic nature of COVID-19-associated critical illness using daily evaluations of COVID-19 patients from 9 academic hospitals. We describe the accessibility and utility of methods that consider the clinical trajectory of critically ill COVID-19 patients.
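A minimal, hypothetical sketch of the multistate idea (not the authors' fitted model, and with invented state codings and sequences): daily integer-coded patient states can be tabulated into a first-order Markov transition matrix, whose rows give the estimated probability of moving between clinical states from one day to the next.

```python
import numpy as np

def transition_matrix(sequences, n_states):
    """Estimate a first-order Markov transition matrix from daily
    state sequences (one list of integer-coded states per patient)."""
    counts = np.zeros((n_states, n_states))
    for seq in sequences:
        for a, b in zip(seq[:-1], seq[1:]):
            counts[a, b] += 1  # tally observed day-to-day transitions
    row_sums = counts.sum(axis=1, keepdims=True)
    # Row-normalise; states never observed keep an all-zero row.
    return np.divide(counts, row_sums,
                     out=np.zeros_like(counts), where=row_sums > 0)

# Hypothetical coding: 0 = room air, 1 = supplemental oxygen,
# 2 = mechanical ventilation, 3 = discharged (absorbing state).
patients = [
    [0, 1, 1, 2, 2, 1, 0, 3],
    [1, 1, 0, 0, 3],
]
P = transition_matrix(patients, n_states=4)
print(P)
```

Real multistate analyses add covariates and handle censoring, but this transition matrix is the basic object such models estimate and smooth.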
Introduced mammalian predators are responsible for the decline and extinction of many native species, with rats (genus Rattus) being among the most widespread and damaging invaders worldwide. In a naturally fragmented landscape, we demonstrate the multi-year effectiveness of snap traps in the removal of Rattus rattus and Rattus exulans from lava-surrounded forest fragments ranging in size from <0.1 to >10 ha. Relative to other studies, we observed low levels of fragment recolonization. Larger rats were the first to be trapped, with the average size of trapped rats decreasing over time. Rat removal led to distinct shifts in the foraging height and location of mongooses and mice, emphasizing the need to focus control efforts on multiple invasive species at once. Furthermore, because of a specially designed trap casing, we observed low non-target capture rates, suggesting that on Hawai‘i and similar islands lacking native rodents, the risk of killing non-target species in snap traps may be lower than with the application of rodenticides, which have the potential to contaminate food webs. These efforts demonstrate that targeted snap-trapping is an effective removal method for invasive rats in fragmented habitats and that, where used, monitoring of recolonization should be included as part of a comprehensive biodiversity management strategy.
Psychotropic prescription rates continue to increase in the United States (USA). Few studies have investigated whether social-structural factors may play a role in psychotropic medication use independent of mental illness. Food insecurity is prevalent among people living with HIV in the USA and has been associated with poor mental health. We investigated whether food insecurity was associated with psychotropic medication use independent of the symptoms of depression and anxiety among women living with HIV in the USA.
Methods
We used cross-sectional data from the Women's Interagency HIV Study (WIHS), a nationwide cohort study. Food security (FS) was the primary explanatory variable, measured using the Household Food Security Survey Module. First, we used multivariable linear regressions to test whether FS was associated with symptoms of depression (Center for Epidemiologic Studies Depression [CESD] score), generalised anxiety disorder (GAD-7 score) and mental health-related quality of life (MOS-HIV Mental Health Summary score; MHS). Next, we examined associations of FS with the use of any psychotropic medications, including antidepressants, sedatives and antipsychotics, using multivariable logistic regressions adjusting for age, race/ethnicity, income, education and alcohol and substance use. In separate models, we additionally adjusted for symptoms of depression (CESD score) and anxiety (GAD-7 score).
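The following Python sketch illustrates, with simulated data and illustrative variable names (not the WIHS dataset, its actual variable names, or its full covariate list), the general shape of the two-stage adjustment described above: logistic regressions of psychotropic medication use on food security categories, fit first without and then with the symptom scores.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data; all names and distributions are hypothetical.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "psychotropic_use": rng.integers(0, 2, n),          # 0/1 outcome
    "food_security": rng.choice(
        ["high", "marginal", "low", "very_low"], n),    # FS categories
    "age": rng.normal(45, 10, n),
    "cesd": rng.normal(15, 8, n),                       # depression score
    "gad7": rng.normal(6, 4, n),                        # anxiety score
})

# Model 1: FS categories plus covariates (only age here; the study also
# adjusted for race/ethnicity, income, education, alcohol/substance use).
m1 = smf.logit("psychotropic_use ~ C(food_security, Treatment('high')) + age",
               data=df).fit(disp=0)

# Model 2: additionally adjust for depression and anxiety symptom scores.
m2 = smf.logit("psychotropic_use ~ C(food_security, Treatment('high')) + age"
               " + cesd + gad7", data=df).fit(disp=0)

print(np.exp(m2.params))  # exponentiated coefficients = odds ratios
```

Comparing the FS coefficients across the two models shows how much of any FS association is accounted for by symptom burden, which is the logic behind the "independent of depression and anxiety" claim in the results.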
Results
Of the 905 women in the sample, two-thirds were African-American. Lower FS (i.e. worse food insecurity) was associated with greater symptoms of depression and anxiety in a dose–response relationship. For the psychotropic medication outcomes, marginal and low FS were associated with 2.06 (p < 0.001; 95% confidence interval [CI] = 1.36–3.13) and 1.99 (p < 0.01; 95% CI = 1.26–3.15) times higher odds of any psychotropic medication use, respectively, before adjusting for depression and anxiety. The association of very low FS with any psychotropic medication use was not statistically significant. A similar pattern was found for antidepressant and sedative use. After additionally adjusting for CESD and GAD-7 scores, marginal FS remained associated with 1.93 (p < 0.05; 95% CI = 1.16–3.19) times higher odds of any psychotropic medication use. Very low FS, conversely, was significantly associated with lower odds of antidepressant use (adjusted odds ratio = 0.42; p < 0.05; 95% CI = 0.19–0.96).
Conclusions
Marginal FS was associated with higher odds of using psychotropic medications independent of depression and anxiety, while very low FS was associated with lower odds. These complex findings may indicate that people experiencing very low FS face barriers to accessing mental health services, while those experiencing marginal FS who do access services are more likely to be prescribed psychotropic medications for distress arising from social and structural factors.
Objective:
To assess the impact of a newly developed Central-Line Insertion Site Assessment (CLISA) score on the incidence of local inflammation or infection for CLABSI prevention.
Design:
A pre- and postintervention, quasi-experimental quality improvement study.
Setting and participants:
Adult inpatients with central venous catheters (CVCs) hospitalized in an intensive care unit or oncology ward at a large academic medical center.
Methods:
We evaluated the impact of the CLISA score on the incidence of insertion-site inflammation and infection (CLISA score of 2 or 3) in the baseline period (June 2014–January 2015) and the intervention period (April 2015–October 2017) using interrupted time series and generalized linear mixed-effects multivariable analyses. Separate models were run for days to line removal after identification of a CLISA score of 2 or 3. CLISA score interrater reliability and photo quiz results were evaluated.
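As a hedged sketch of the interrupted time series component (simulated monthly data and made-up variable names, not the study's dataset or exact model specification), a standard segmented regression with a baseline trend, an immediate level change at the intervention, and a post-intervention slope change can be fit as follows.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical monthly series: fraction of assessed lines scoring 2-3,
# with the intervention beginning at month 12.
rng = np.random.default_rng(1)
months = np.arange(36)
post = (months >= 12).astype(int)
rate = 0.22 - 0.17 * post + rng.normal(0, 0.01, 36)

df = pd.DataFrame({
    "month": months,                                # baseline trend
    "post": post,                                   # level change
    "months_since": np.clip(months - 12, 0, None),  # post-intervention slope
    "rate": rate,
})

# Segmented regression: the 'post' coefficient estimates the immediate
# level change at the intervention, analogous to the significant
# immediate decrease reported in the results.
its = smf.ols("rate ~ month + post + months_since", data=df).fit()
print(its.params)
```

A mixed-effects version would add unit- or patient-level random effects, in the spirit of the generalized linear mixed-effects analyses mentioned above.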
Results:
Among 6,957 CVCs assessed 40,846 times, the percentage of lines with a CLISA score of 2 or 3 decreased by 78.2% between the baseline and intervention periods (from 22.0% to 4.7%), with a significant immediate decrease in the time-series analysis (P < .001). According to the multivariable regression, the intervention was associated with a lower percentage of lines with a CLISA score of 2 or 3, after adjusting for age, gender, CVC body location, and hospital unit (odds ratio, 0.15; 95% confidence interval, 0.06–0.34; P < .001). In the multivariable regression, time to removal of lines with a CLISA score of 2 or 3 was 3.19 days shorter after the intervention (P < .001). Also, line dwell time decreased 37.1%, from a mean of 14 days (standard deviation [SD], 10.6) to 8.8 days (SD, 9.0) (P < .001). Device utilization ratios decreased 9%, from 0.64 (SD, 0.08) to 0.58 (SD, 0.06) (P = .039).
Conclusions:
The CLISA score creates a common language for assessing line infection risk and successfully promotes high compliance with best practices in timely line removal.
Purple nutsedge (Cyperus rotundus L.) and yellow nutsedge (C. esculentus L.) appeared to be equally acceptable for oviposition by caged Bactra verutana Zeller, but purple nutsedge was significantly more suitable as a host: 90% of the larvae survived to maturity on purple nutsedge compared with 65% on yellow nutsedge. Responses of the plant species to both larval feeding injury and plant density were similar but purple nutsedge tended to be injured more than yellow nutsedge. At a high shoot density (nine shoots per pot), production of tubers by purple nutsedge was more adversely affected by feeding of five larvae per shoot than was production by yellow nutsedge: tuber dry weights were reduced 93 and 80% and numbers of tubers per pot were reduced 77 and 62%, respectively. Production of inflorescences was greatly reduced in both species. The effect of B. verutana on inflorescences may be more important for yellow nutsedge, which is generally considered to reproduce freely by seeds. Both species of nutsedge probably would be about equally affected by augmentation of B. verutana populations as a method of biological control.
Several herbicide-based weed management programs for glyphosate-tolerant cotton were compared in eight field studies across Alabama during 1996 and 1997. Weed management programs ranged from traditional, soil-applied residual herbicide programs to more recently developed total postemergence (POST) herbicide programs. Pitted morningglory and sicklepod control was best achieved with fluometuron applied preemergence (PRE) followed by (fb) a single POST over-the-top (POT) application of glyphosate fb a POST-directed application of glyphosate. Annual grass control was better with the preplant-incorporated (PPI) programs at two of three locations in both years. Treatments that included at least one POT application of glyphosate gave better grass control than those with no glyphosate or pyrithiobac applied POT. Velvetleaf control was improved with the addition of glyphosate POT. A herbicide program using no POST herbicides yielded significantly less seed cotton than any program using POST herbicides at one location. PRE- and POST-only weed management programs at another location produced more seed cotton and gave greater net returns than PPI programs; at that same location, net returns were equivalent for PRE- and POST-only programs and lower for PPI programs. Overall, POST-only programs yielded the highest amounts of seed cotton and netted the greatest returns.
A telephone survey was conducted with growers in Iowa, Illinois, Indiana, Nebraska, Mississippi, and North Carolina to discern the utilization of the glyphosate-resistant (GR) trait in crop rotations, weed pressure, tillage practices, herbicide use, and perception of GR weeds. This paper focuses on survey results regarding herbicide decisions made during the 2005 cropping season. Less than 20% of the respondents made fall herbicide applications. The most frequently used herbicides for fall applications were 2,4-D and glyphosate, and these herbicides were also the most frequently used for preplant burndown weed control in the spring. Atrazine and acetochlor were frequently used in rotations containing GR corn. As expected, crop rotations using a GR crop had a high percentage of respondents that made one to three POST applications of glyphosate per year. GR corn, GR cotton, and non-GR crops had the highest percentage of growers applying non-glyphosate herbicides during the 2005 growing season. A crop rotation containing GR soybean had the greatest negative impact on non-glyphosate use. Overall, glyphosate use has continued to increase, with concomitant decreases in utilization of other herbicides.
Corn and soybean growers in Illinois, Indiana, Iowa, Mississippi, Nebraska, and North Carolina, as well as cotton growers in Mississippi and North Carolina, were surveyed about their views on changes in problematic weeds and weed pressure in cropping systems based on a glyphosate-resistant (GR) crop. No growers using a GR cropping system for more than 5 yr reported heavy weed pressure. Over all cropping systems investigated (continuous GR soybean, continuous GR cotton, GR corn/GR soybean, GR soybean/non-GR crop, and GR corn/non-GR crop), 0 to 7% of survey respondents reported greater weed pressure after implementing rotations using GR crops, whereas 31 to 57% felt weed pressure was similar and 36 to 70% indicated that weed pressure was less. Pigweed, morningglory, johnsongrass, ragweed, foxtail, and velvetleaf were mentioned as their most problematic weeds, depending on the state and cropping system. Systems using GR crops improved weed management compared with the technologies used before the adoption of GR crops. However, the long-term success of managing problematic weeds in GR cropping systems will require the development of multifaceted integrated weed management programs that include glyphosate as well as other weed management tactics.
A phone survey was administered to 1,195 growers in six states (Illinois, Indiana, Iowa, Mississippi, Nebraska, and North Carolina). The survey measured producers' crop history, perception of glyphosate-resistant (GR) weeds, past and present weed pressure, tillage practices, and herbicide use as affected by the adoption of GR crops. This article describes the changes in tillage practice reported in the survey. The adoption of a GR cropping system resulted in a large increase in the percentage of growers using no-till and reduced-till systems. Tillage intensity declined more in continuous GR cotton and GR soybean (45 and 23%, respectively) than in rotations that included GR corn or non-GR crops. Tillage intensity declined more in Mississippi and North Carolina than in the other states, with 33% of the growers in these states shifting to more conservative tillage practices after the adoption of a GR crop, primarily because conservation tillage was less widely adopted in these states before GR crops became available. Adoption rates of no-till and reduced-till systems increased as farm size decreased. Overall, producers in a crop rotation that included a GR crop shifted from a relatively more tillage-intensive system to reduced-till or no-till systems after implementing a GR crop into their production system.
Over 175 growers in each of six states (Illinois, Indiana, Iowa, Mississippi, Nebraska, and North Carolina) were surveyed by telephone to assess their perceptions of the benefits of utilizing the glyphosate-resistant (GR) crop trait in corn, cotton, and soybean. The survey was also used to determine the weed management challenges growers were facing after using this trait for a minimum of 4 yr. This survey allowed the development of baseline information on how weed management and crop production practices have changed since the introduction of the trait. It provided useful information on common weed management issues that should be addressed through applied research and extension efforts. The survey also allowed an assessment of the perceived levels of concern among growers about glyphosate resistance in weeds and whether they believed they had experienced glyphosate resistance on their farms. Across the six states surveyed, producers reported 38, 97, and 96% of their corn, cotton, and soybean hectarage planted in a GR cultivar. The most widely adopted GR cropping system was a GR soybean/non-GR crop rotation system; second most common was a GR soybean/GR corn crop rotation system. The non-GR crop component varied widely, with the most common crops being non-GR corn or rice. A large range in farm size for the respondents was observed, with North Carolina having the smallest farms in all three crops. A large majority of corn and soybean growers reported using some type of crop rotation system, whereas very few cotton growers rotated out of cotton. Overall, rotations were much more common in Midwestern states than in Southern states. This is important information as weed scientists assist growers in developing and using best management practices to minimize the development of glyphosate resistance.