Understanding the contact patterns of healthcare personnel (HCP) is important for mitigating healthcare-associated infectious disease transmission. Little is known about how HCP contact patterns change over time or during outbreaks such as the COVID-19 pandemic.
Methods:
This study in a large United States healthcare system examined the social contact patterns of HCP via standardized social contact diaries. HCP were enrolled from October 2020 to June 2022. Participants completed monthly surveys of social contacts during a representative working day. In June 2022, participants completed a 2-day individual-level contact diary. Regression models estimated the association between contact rates and job type. We generated age-stratified contact matrices.
Results:
Three hundred and sixty HCP enrolled; 157 completed one or more monthly contact diaries and 88 completed the intensive 2-day diary. In the monthly contact diaries, the median number of daily contacts was 15 (interquartile range [IQR] 8–20), which increased slightly during the study (slope estimate 0.004, p = 0.016). In the individual-level contact diaries, the 88 HCP reported 2,550 contacts over 2 days. HCP were 2.8 times more likely to contact other HCP (n = 1,592 contacts) than patients (n = 570 contacts). Rehabilitation/transport staff, diagnostic imaging technologists, doctors, nurses, mid-level practitioners, and laboratory personnel had higher contact rates than the lowest-contact group (nursing aides). Contacts in the age-stratified matrices were concentrated among working-age populations.
Conclusions:
HCP contacts are concentrated in the work environment, primarily with other HCP. These contacts remained stable over time, even as societal contact patterns changed markedly during the COVID-19 pandemic. This stability is critical for designing outbreak and pandemic responses.
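As an illustration of the age-stratified contact matrices described above, here is a minimal Python sketch of how such a matrix can be assembled from contact diary rows. The column names, age strata, and data are hypothetical stand-ins, not values from the study.

```python
# Sketch: assembling an age-stratified contact matrix from diary rows.
# Column names, age strata, and data are hypothetical, not from the study.
import numpy as np
import pandas as pd

# One row per reported contact: the diarist's age and the contact's age.
diary = pd.DataFrame({
    "participant_id":  [1, 1, 2, 3, 4, 4],
    "participant_age": [34, 34, 52, 29, 61, 61],
    "contact_age":     [30, 58, 47, 33, 64, 41],
})

bins   = [0, 30, 45, 60, 100]                  # age strata (illustrative)
labels = ["<30", "30-44", "45-59", "60+"]
diary["i"] = pd.cut(diary["participant_age"], bins=bins, right=False, labels=labels)
diary["j"] = pd.cut(diary["contact_age"], bins=bins, right=False, labels=labels)

# Distinct participants per row stratum, then mean contacts per participant.
n_participants = diary.groupby("i", observed=False)["participant_id"].nunique()
counts = diary.groupby(["i", "j"], observed=False).size().unstack(fill_value=0)
matrix = counts.div(n_participants.replace(0, np.nan), axis=0)
print(matrix.round(2))
```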
Various water-based heater-cooler devices (HCDs) have been implicated in outbreaks of nontuberculous mycobacteria (NTM). Ongoing rigorous surveillance for healthcare-associated Mycobacterium abscessus (HA-Mab), put in place following a prior institutional outbreak of M. abscessus, alerted investigators to a cluster of 3 extrapulmonary M. abscessus infections among patients who had undergone cardiothoracic surgery.
Methods:
Investigators convened a multidisciplinary team and launched a comprehensive investigation to identify potential sources of M. abscessus in the healthcare setting. Adherence to tap water avoidance protocols during patient care and HCD cleaning, disinfection, and maintenance practices were reviewed. Relevant environmental samples were obtained. Patient and environmental M. abscessus isolates were compared using multilocus-sequence typing and pulsed-field gel electrophoresis. Smoke testing was performed to evaluate the potential for aerosol generation and dispersion during HCD use. The entire HCD fleet was replaced to mitigate continued transmission.
Results:
Clinical presentations of case patients and epidemiologic data supported intraoperative acquisition. M. abscessus was isolated from HCDs used on patients, and molecular comparison with patient isolates demonstrated clonality. Smoke testing demonstrated the potential for aerosolization of M. abscessus from HCDs during device operation. Since the HCD fleet was replaced, no additional extrapulmonary HA-Mab infections due to the unique clone identified in this cluster have been detected.
Conclusions:
Despite cleaning and disinfection practices that went beyond manufacturer instructions for use, HCDs became colonized with and ultimately transmitted M. abscessus to 3 patients. Design modifications to better contain aerosols or filter exhaust during device operation are needed to prevent NTM transmission events from water-based HCDs.
Background: The epidemiology of extended-spectrum cephalosporin-resistant Enterobacterales (ESCrE) in hospitalized patients in low- and middle-income countries (LMICs) is poorly described. Although risk factors for ESCrE clinical infection have been studied, little is known of the epidemiology of ESCrE colonization. Identifying risk factors for ESCrE colonization, which can predispose to infection, is therefore critical to inform antibiotic resistance reduction strategies.
Methods: This study was conducted in 3 hospitals located in 3 districts in Botswana. In each hospital, we conducted ongoing surveillance in sequential units hospitalwide. All participants had rectal swabs collected, which were inoculated onto chromogenic media followed by confirmatory testing using MALDI-TOF MS and VITEK-2. Data on demographics, comorbidities, antibiotic use, healthcare exposures, invasive procedures, travel, animal contact, and food consumption were collected via interview and review of the inpatient medical record. Participants with ESCrE colonization (cases) were compared to noncolonized participants (controls) using bivariable and multivariable analyses to identify risk factors for ESCrE colonization.
Results: Enrollment occurred from January 15, 2020, to September 4, 2020, and 469 participants were enrolled. The median age was 42 years (IQR, 31–58) and 320 (68.2%) were female. The median time from hospital admission to date of sampling was 5 days (IQR, 3–12). There were 179 cases and 290 controls (ie, 38.2% of participants were ESCrE colonized). Independent risk factors for ESCrE colonization were a greater number of days on antibiotics, recent healthcare exposure, and tending swine prior to hospitalization.
Conclusions: ESCrE colonization among hospitalized patients was common and was associated with several exposures. Our results suggest prior healthcare exposure may be important in driving ESCrE colonization. The strong link to recent antibiotic use highlights the potential role of antibiotic stewardship interventions in prevention. The association with tending swine suggests that animal husbandry practices may play a role in community exposures, resulting in colonization detected at the time of hospital admission. These findings will help to inform future studies assessing strategies to curb further emergence of hospital ESCrE in LMICs.
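As a sketch of the kind of multivariable analysis described above, the snippet below fits a logistic regression of colonization status on the three reported exposures. Variable names and data are synthetic illustrations, not the study dataset.

```python
# Sketch of a multivariable analysis of ESCrE colonization risk factors.
# Variable names and data are synthetic illustrations, not the study dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 469
df = pd.DataFrame({
    "abx_days":         rng.poisson(3, n),        # days on antibiotics
    "prior_healthcare": rng.integers(0, 2, n),    # recent healthcare exposure
    "tends_swine":      rng.integers(0, 2, n),    # tended swine before admission
})
# Synthetic outcome generated so the three exposures raise colonization odds.
logit = -1.5 + 0.15 * df.abx_days + 0.8 * df.prior_healthcare + 0.7 * df.tends_swine
df["escre"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = smf.logit("escre ~ abx_days + prior_healthcare + tends_swine", data=df).fit(disp=0)
print(np.exp(model.params).round(2))              # adjusted odds ratios
print(model.conf_int().apply(np.exp).round(2))    # 95% CIs on the OR scale
```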
The extent to which weed species vary in their ability to acquire and use different forms of nitrogen (N) (inorganic and organic) has not been investigated but could have important implications for weed survival and weed–crop competition in agroecosystems. We conducted a controlled environment experiment using stable isotopes to determine the uptake and partitioning of organic and inorganic N (amino acids, ammonium, and nitrate) by seven common weed and non-weed species. All species took up inorganic and organic N, including as intact amino acids. Concentrations of 15N derived from both ammonium and amino acids in shoot tissues were higher in large crabgrass [Digitaria sanguinalis (L.) Scop.] and barnyardgrass [Echinochloa crus-galli (L.) P. Beauv] than in common lambsquarters (Chenopodium album L.), redroot pigweed (Amaranthus retroflexus L.), and sorghum-sudangrass [Sorghum bicolor (L.) Moench × Sorghum bicolor (L.) ssp. drummondii (Nees ex Steud.) de Wet & Harlan]. In contrast, the concentration of 15N derived from nitrate was higher in wild mustard (Sinapis arvensis L.) shoots than in wild oat (Avena fatua L.) shoots. Root concentration of 15N derived from ammonium was lower in sorghum-sudangrass compared with other species, except for A. retroflexus and A. fatua, while root concentration of 15N derived from nitrate was lower in A. retroflexus compared with other species, except for C. album and S. arvensis. Discriminant analysis classified species based on their uptake and partitioning of all three labeled N forms. These results suggest that common agricultural weeds can access and use organic N and differentially take up inorganic N forms. Additional research is needed to determine whether species-specific differences in organic and inorganic N uptake influence the intensity of competition for soil N.
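For readers unfamiliar with discriminant analysis in this setting, here is a minimal Python sketch using scikit-learn's linear discriminant analysis to classify species by their uptake of three labeled N forms. The species centroids and 15N concentrations are synthetic stand-ins, not the measured values.

```python
# Sketch: classifying species by their uptake of three labeled N forms with
# linear discriminant analysis (LDA). Values are synthetic stand-ins.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
species = np.repeat(["D. sanguinalis", "S. arvensis", "A. retroflexus"], 10)
# Columns: shoot 15N derived from amino acids, ammonium, and nitrate.
centers = {
    "D. sanguinalis": [5.0, 6.0, 2.0],
    "S. arvensis":    [2.0, 3.0, 6.0],
    "A. retroflexus": [2.5, 3.5, 2.0],
}
X = np.vstack([rng.normal(centers[s], 0.8) for s in species])

lda = LinearDiscriminantAnalysis().fit(X, species)
print(f"training accuracy: {lda.score(X, species):.2f}")
```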
Many male prisoners have significant mental health problems, including anxiety and depression. High proportions struggle with homelessness and substance misuse.
Aims
This study aims to evaluate whether the Engager intervention improves mental health outcomes following release.
Method
The design is a parallel randomised superiority trial that was conducted in the North West and South West of England (ISRCTN11707331). Men serving a prison sentence of 2 years or less were individually allocated 1:1 to either the intervention (Engager plus usual care) or usual care alone. Engager included psychological and practical support in prison, on release and for 3–5 months in the community. The primary outcome was the Clinical Outcomes in Routine Evaluation Outcome Measure (CORE-OM), 6 months after release. Primary analysis compared groups based on intention-to-treat (ITT).
Results
In total, 280 men were randomised out of the 396 who were potentially eligible and agreed to participate; 105 did not meet the mental health inclusion criteria. There was no significant mean difference between groups in the ITT complete-case analysis (92 in each arm) for change in the CORE-OM score (1.1, 95% CI −1.1 to 3.2, P = 0.325), nor in secondary analyses. There were no consistent clinically significant between-group differences in secondary outcomes. Full delivery of the intervention was not achieved, with 77% (108/140) receiving community-based contact.
Conclusions
Engager is the first trial of a collaborative care intervention adapted for prison leavers. The intervention was not shown to be effective using standard outcome measures. Further testing of different support strategies for prison leavers with mental health problems is needed.
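For reference, the primary complete-case comparison reported above amounts to a two-sample comparison of change scores with a 95% confidence interval. Here is a minimal Python sketch with synthetic data (the true scores are not reproduced here).

```python
# Sketch of the primary ITT complete-case comparison: difference between arms
# in CORE-OM change scores with a 95% CI. Data below are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
engager = rng.normal(5.0, 7.5, 92)   # synthetic change scores, intervention arm
usual   = rng.normal(4.0, 7.5, 92)   # synthetic change scores, usual-care arm

diff = engager.mean() - usual.mean()
se = np.sqrt(engager.var(ddof=1) / engager.size + usual.var(ddof=1) / usual.size)
tcrit = stats.t.ppf(0.975, engager.size + usual.size - 2)
t, p = stats.ttest_ind(engager, usual)
print(f"mean difference = {diff:.1f}, "
      f"95% CI {diff - tcrit * se:.1f} to {diff + tcrit * se:.1f}, p = {p:.3f}")
```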
Kinetoplastid parasites are responsible for both human and animal diseases across the globe, where they have a great impact on health and economic well-being. Many species and life cycle stages are difficult to study due to limitations in isolation and culture, as well as their existence as heterogeneous populations in hosts and vectors. Single-cell RNA sequencing (scRNA-seq) has the capacity to overcome many of these difficulties and can be leveraged to disentangle heterogeneous populations, highlight genes crucial for propagation through the life cycle, and enable detailed analysis of host–parasite interactions. Here, we review the studies that have so far applied scRNA-seq to protozoan parasites. In addition, we provide an overview of sample preparation and technology choices to consider when planning scRNA-seq experiments, as well as the challenges faced when analysing the large amounts of data generated. Finally, we highlight areas of kinetoplastid research that could benefit from scRNA-seq technologies.
The origin and stability of ground ice in the stable uplands of the McMurdo Dry Valleys remain poorly understood, with most studies focusing on the near-surface permafrost. The 2016 Friis Hills Drilling Project retrieved five cores reaching 50 m depth in mid-Miocene permafrost, a period when Antarctica transitioned to a hyper-arid environment. This study characterizes the cryostratigraphy of arguably the oldest permafrost on Earth and assesses 15 Myr of ground ice evolution using the REGO model. Four cryostratigraphic units were identified: 1) surficial dry permafrost (0–30 cm); 2) ice-rich to ice-poor permafrost (0.3–5.0 m) with high solute load, high δ18O values (−16.2 ± 1.8‰) and low D-excess values (−65.6 ± 4.3‰); 3) near-dry permafrost (5–20 m); and 4) ice-poor to ice-rich permafrost (20–50 m) containing ice lenses with low solute load, low δ18O values (−34.6 ± 1.2‰) and D-excess of 6.9 ± 2.6‰. The near-surface δ18O profile of ground ice is comparable to that at other sites in the stable uplands, suggesting that this ice is actively responding to changing surface environmental conditions and challenging the assumption that the surface has remained frozen for 13.8 Myr. The deep ice lenses probably originate from the freezing of meteoric water during the mid-Miocene, and their δ18O composition suggests mean annual air temperatures ~7–11°C warmer than today.
Cover crops are increasingly being used for weed management, and planting them as diverse mixtures has become a popular implementation strategy. While ecological theory suggests that cover crop mixtures should be more weed suppressive than cover crop monocultures, few experiments have explicitly tested this for more than a single temporal niche. We assessed the effects of cover crop mixtures (5- or 6-species and 14-species mixtures) and monocultures on weed abundance (weed biomass) and weed suppression at the time of cover crop termination. Separate experiments were conducted in Madbury, NH, from 2014 to 2017 for each of three temporal cover-cropping niches: summer (spring planting–summer termination), fall (summer planting–fall termination), and spring (fall planting–subsequent spring termination). Regardless of temporal niche, mixtures were never more weed suppressive than the most weed-suppressive cover crop grown as a monoculture, and the more diverse mixture (14 species) never outperformed the less diverse mixture. Mean weed-suppression levels of the best-performing monocultures in each temporal niche ranged from 97% to 98% for buckwheat (Fagopyrum esculentum Moench) in the summer niche and forage radish (Raphanus sativus L. var. niger J. Kern.) in the fall niche, and 83% to 100% for triticale (×Triticosecale Wittm. ex A. Camus [Secale × Triticum]) in the spring niche. In comparison, weed-suppression levels for the mixtures ranged from 66% to 97%, 70% to 90%, and 67% to 99% in the summer, fall, and spring niches, respectively. Stability of weed suppression, measured as the coefficient of variation, was two to six times greater in the best-performing monoculture compared with the most stable mixture, depending on the temporal niche. Results of this study suggest that when weed suppression is the sole objective, farmers are more likely to achieve better results planting the most weed-suppressive cover crop as a monoculture than a mixture.
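The stability comparison above rests on the coefficient of variation (CV), where a lower CV indicates more stable suppression across site-years. A minimal Python sketch with illustrative suppression values follows.

```python
# Sketch: stability of weed suppression as the coefficient of variation (CV);
# a lower CV means more stable suppression. Values below are illustrative.
import numpy as np

def cv(x):
    """Coefficient of variation: sample SD as a fraction of the mean."""
    x = np.asarray(x, dtype=float)
    return x.std(ddof=1) / x.mean()

monoculture = [97, 98, 97, 98]   # % suppression across site-years (illustrative)
mixture     = [66, 97, 70, 90]

print(f"monoculture CV = {cv(monoculture):.3f}, mixture CV = {cv(mixture):.3f}")
print(f"CV ratio (mixture/monoculture) = {cv(mixture) / cv(monoculture):.1f}")
```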
Edited by
Sue Clayton, Goldsmiths, University of London; Anna Gupta, Royal Holloway, University of London; Katie Willis, Royal Holloway, University of London
The unaccompanied child seeking asylum occupies an ambiguous space in political and legal discourses. The discourse of childhood often presents them as vulnerable victims of circumstances beyond their control, devoid of agency. On the other hand, in discourses around asylum and migration, terms such as ‘illegal migrant’ construct them as manipulative, potentially dangerous – the unknown and threatening ‘other’. As soon as they arrive in the UK, young asylum seekers find themselves caught between two domains of law that mirror those contrasting political discourses: the protective framework of care law provided by the Children Act 1989 and motivated by the principle of the child's welfare being paramount – ‘Every Child Matters’ – and the punitive framework of numerous Immigration Acts dominated by the call for ‘Securing our Borders: Controlling Migration’ (UK Border Agency, 2010). It is the latter that has shaped the legal framework governing determination of young people's asylum claims.
It has long been recognised that child asylum seekers are a particularly vulnerable sub-section of an already vulnerable population. Yet despite this recognition, child asylum seekers coming to the UK face particular obstacles in making their claims for asylum. This chapter explores recent developments in the UK's approach to children seeking asylum and considers how the asylum system has in many cases failed to provide durable solutions for child refugees. It questions why, despite well-written guidelines and public awareness of the specific protection needs of children from conflict zones, the success rate for child asylum claims is often lower than for adults from the same country. It then considers the legal obstacles that young asylum seekers face and the complexities of the system they are forced to navigate.
This chapter opens with a comparison between law in theory and law in practice as it applies to young asylum seekers. We highlight specific problematic issues: assessing credibility, the provision of legal aid, and delay in decision making. Two case studies then illustrate the uncertain journeys through the legal process faced by two of the largest groups of asylum-seeking young people: Afghans and Eritreans.
The law in theory
UK asylum law and guidance provide a number of protective measures for young asylum seekers.
High-residue cover crops can facilitate organic no-till vegetable production when cover crop biomass production is sufficient to suppress weeds (>8000 kg ha−1), and cash crop growth is not limited by soil temperature, nutrient availability, or cover crop regrowth. In cool climates, however, both cover crop biomass production and soil temperature can be limiting for organic no-till. In addition, successful termination of cover crops can be a challenge, particularly when cover crops are grown as mixtures. We tested whether reusable plastic tarps, an increasingly popular tool for small-scale vegetable farmers, could be used to augment organic no-till cover crop termination and weed suppression. We no-till transplanted cabbage into a winter rye (Secale cereale L.)–hairy vetch (Vicia villosa Roth) cover crop mulch that was terminated with either a roller-crimper alone or a roller-crimper plus black or clear tarps. Tarps were applied for durations of 2, 4 and 5 weeks. Across tarp durations, black tarps increased the mean cabbage head weight by 58% compared with the no tarp treatment. This was likely due to a combination of improved weed suppression and nutrient availability. Although soil nutrients and biological activity were not directly measured, remaining cover crop mulch in the black tarp treatments was reduced by more than 1100 kg ha−1 when tarps were removed compared with clear and no tarp treatments. We interpret this as an indirect measurement of biological activity, perhaps accelerated by lower daily soil temperature fluctuations and more constant volumetric water content under black tarps. The edges of both tarp types were held down, rather than buried, but moisture losses from the clear tarps were greater and this may have affected the efficacy of clear tarps. Plastic tarps effectively killed the vetch cover crop, whereas it readily regrew in the crimped but uncovered plots. However, emergence of large and smooth crabgrass (Digitaria spp.) appeared to be enhanced in the clear tarp treatment. Although this experiment was limited to a single site-year in New Hampshire, it shows that use of black tarps can overcome some of the obstacles to implementing cover crop-based no-till vegetable production in northern climates.
Method of Levels (MOL) is an innovative transdiagnostic cognitive therapy with potential advantages over existing psychological treatments for psychosis.
Aims
The Next Level study is a feasibility randomised controlled trial (RCT) of MOL for people experiencing first-episode psychosis. It aims to determine the suitability of MOL for further testing in a definitive trial (trial registration ISRCTN13359355).
Method
The study uses a parallel-group, non-masked feasibility RCT design with two conditions: (a) treatment as usual (TAU) and (b) TAU plus MOL. Participants (n = 36) were recruited from early intervention in psychosis services. Outcome measures are completed at baseline and at 10 and 14 months. The primary outcomes are recruitment and retention.
Results
Participants’ demographic and clinical characteristics are presented along with baseline data.
Conclusions
Next Level has recruited to target, providing evidence that it is feasible to recruit to an RCT of MOL for first-episode psychosis.
Ice types, albedos and impurity content are characterized for the ablation zone of the Greenland ice sheet in Kronprinz Christians Land (80° N, 24° W). Along this ice margin the width of the ablation zone is only about 8 km. The emergence and melting of old ice in the ablation zone creates a surface layer of dust that was originally deposited with snowfall high on the ice sheet. This debris cover is augmented by locally derived wind-blown sediment. Subsequently, the surface dust particles often aggregate together to form centimetre-scale clumps that melt into the ice, creating cryoconite holes. The debris in the cryoconite holes becomes hidden from sunlight, raising the area-averaged albedo relative to surfaces with uniform debris cover. Spectral and broadband albedos were obtained for snow, ice hummocks, debris-covered ice, cryoconite-studded ice and barren tundra surfaces. Broadband ice albedos varied from 0.2 (for ice with heavy loading of uniform debris) to 0.6 (for ice hummocks with cryoconite holes). The cryoconite material itself has albedo 0.1 when wet. Areal distribution of the major surface types was estimated visually from a transect video as a function of distance from the ice edge (330 m a.s.l.). Ablation rates were measured along a transect from the ice margin to the slush zone 8 km from the margin (550 m a.s.l.), traversing both Pleistocene and Holocene ice. Ablation rates in early August averaged 2 cm d−1. Impurity concentrations were typically 4.3 mg L−1 in the subsurface ice. Surface concentrations were about 16 g m−2 on surfaces with low impurity loading, and heavily loaded surfaces had concentrations as high as 1.4 kg m−2. The mineralogical composition of the cryoconite material is comparable with that of the surrounding soils and with dust on a snowdrift in front of the ice margin, implying that much of the material is derived from local sources. A fine mode (clay) is present in the oldest ice but not in the nearby soil, suggesting that its origin is from wind deposition during Pleistocene glaciation.
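A broadband albedo of the kind reported above is the solar-flux-weighted average of the spectral albedo. The short Python sketch below illustrates the computation with coarse, made-up spectra rather than the measured Greenland values.

```python
# Sketch: broadband albedo as the solar-flux-weighted mean of spectral albedo.
# Both spectra are coarse illustrations, not the measured Greenland values.
import numpy as np

wavelength = np.linspace(0.35, 2.5, 200)      # micrometres
# Illustrative spectral albedo: high in the visible, low in the near-infrared.
spectral_albedo = 0.05 + 0.7 * np.exp(-((wavelength - 0.5) / 0.6) ** 2)
# Illustrative incident solar flux shape (arbitrary units).
solar_flux = np.exp(-((wavelength - 0.55) / 0.45) ** 2)

broadband = (np.trapz(spectral_albedo * solar_flux, wavelength)
             / np.trapz(solar_flux, wavelength))
print(f"broadband albedo = {broadband:.2f}")
```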
The northern New England region includes the states of Vermont, New Hampshire, and Maine and encompasses a large degree of climate and edaphic variation across a relatively small spatial area, making it ideal for studying climate change impacts on agricultural weed communities. We sampled weed seedbanks and measured soil physical and chemical characteristics on 77 organic farms across the region and analyzed the relationships between weed community parameters and select geographic, climatic, and edaphic variables using multivariate procedures. Temperature-related variables (latitude, longitude, mean maximum and minimum temperature) were the strongest and most consistent correlates with weed seedbank composition. Edaphic variables were, for the most part, relatively weaker and inconsistent correlates with weed seedbanks. Our analyses also indicate that a number of agriculturally important weed species are associated with specific U.S. Department of Agriculture plant hardiness zones, implying that future changes in climate factors that result in geographic shifts in these zones will likely be accompanied by changes in the composition of weed communities and therefore new management challenges for farmers.
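As a sketch of relating a seedbank community parameter to climatic and edaphic variables across farms, the snippet below computes simple correlations on synthetic data; all variable names and values are illustrative, and the study itself used multivariate ordination procedures rather than this simplified approach.

```python
# Sketch: correlating a seedbank community parameter with climatic and edaphic
# variables across farms. All names and values are synthetic illustrations.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
n_farms = 77
farms = pd.DataFrame({
    "latitude":      rng.uniform(42.7, 47.5, n_farms),
    "mean_max_temp": rng.uniform(10.0, 16.0, n_farms),
    "soil_ph":       rng.uniform(5.5, 7.2, n_farms),
})
# Synthetic richness built to respond mainly to temperature, as in the study.
farms["seedbank_richness"] = (5 + 1.2 * farms.mean_max_temp
                              + 0.3 * farms.soil_ph
                              + rng.normal(0, 2, n_farms))

print(farms.corr()["seedbank_richness"].round(2))
```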
Background: The value of clients’ reports of their experiences in therapy is widely recognized, yet quantitative methodology has rarely been used to measure clients’ self-reported perceptions of what is helpful over a single session. Aims: A video-rating method was developed to gather data at brief intervals using process measures of client perceived experience and standardized measures of working alliance (Session Rating Scale; SRS). Data were collected over the course of a single video-recorded session of cognitive therapy (Method of Levels Therapy; Carey, 2006; Mansell et al., 2012). We examined the acceptability and feasibility of the methodology and tested the concurrent validity of the measure by utilizing theory-led constructs. Method: Eighteen therapy sessions were video-recorded and clients each rated a 20-minute session of therapy at two-minute intervals using repeated measures. A multi-level analysis was used to test for correlations between perceived levels of helpfulness and client process variables. Results: The design proved to be feasible. Concurrent validity was borne out through high correlations between constructs. A multi-level regression examined the independent contributions of client process variables to client perceived helpfulness. Client perceived control (b = 0.39, 95% CI 0.05 to 0.73), the ability to talk freely (b = 0.30, SE = 0.11, 95% CI 0.09 to 0.51) and therapist approach (b = 0.31, SE = 0.14, 95% CI 0.04 to 0.57) predicted client-rated helpfulness. Conclusions: We identify a feasible and acceptable method for studying continuous measures of helpfulness and their psychological correlates during a single therapy session.
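The multi-level regression reported above treats repeated two-minute ratings as nested within clients. A minimal Python sketch using statsmodels' mixed-effects model on synthetic data follows; variable names are hypothetical.

```python
# Sketch of a multi-level regression: repeated two-minute helpfulness ratings
# nested within clients. Variable names and data are synthetic illustrations.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
clients, intervals = 18, 10
n = clients * intervals
df = pd.DataFrame({
    "client":      np.repeat(np.arange(clients), intervals),
    "control":     rng.normal(0, 1, n),      # client perceived control
    "talk_freely": rng.normal(0, 1, n),      # ability to talk freely
})
client_effect = rng.normal(0, 0.5, clients)[df["client"]]
df["helpfulness"] = (5 + 0.4 * df["control"] + 0.3 * df["talk_freely"]
                     + client_effect + rng.normal(0, 1, n))

# Random intercept per client; fixed effects for the process variables.
model = smf.mixedlm("helpfulness ~ control + talk_freely",
                    df, groups=df["client"]).fit()
print(model.summary())
```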
A contradiction has existed in the literature as to the conditions favoring formation of “ablation hollows” (“suncups”) on a melting snow surface. Some experiments find that these features grow under direct sunlight and decay in overcast, windy weather; whereas others find just the opposite result, that they grow best under cloudy, windy conditions and decay if exposed to direct sunlight. We find that the hidden variable in past experiments, which acts as a switch to determine which mode of formation can operate, is the absence or abundance of dark insoluble impurities in the snow. Direct sunlight causes ablation hollows to grow in clean snow and to decay in dirty snow (for dirt content below a critical value), because the dirt migrates to the ridges between the hollows, lowering the albedo at the ridges. By contrast, when ablation is dominated by turbulent heat exchange, the presence of dirt favors development of ablation hollows because the dirt migrates to the ridges and insulates them; albedo reduction has a negligible effect on ablation.
This hypothesis is supported by an experiment which showed that the presence of a thin layer of volcanic ash on the snow inhibited formation of ablation hollows under direct sunlight.
Observations of temperature maxima at about 10 cm depth in cold Antarctic snow during summer have previously been explained by proposing that solar heating is distributed with depth whereas thermal infrared cooling is localized at the surface (the “solid-state greenhouse”). An increase in temperature from the surface to 10 cm depth (ΔT ≈ 4 K) found by Rusin (1961) on the Antarctic Plateau was successfully reproduced by Schlatter (1972) in a combined radiative-transfer and heat-transfer model. However, when we improve the model’s spectral resolution, solving for solar radiative fluxes separately in 118 wavelength bands instead of just one “average” wavelength, ΔT shrinks to 0.2 K and moves toward the surface, indicating that the solid-state greenhouse is largely an artifact of inadequate spectral resolution. The agreement between Schlatter’s broad-band model and Rusin’s measurement suggests that the measurement is inaccurate, perhaps due to solar heating of the buried thermistors. Similar broad-band models which have been applied to the icy surface of Jupiter’s satellite Europa are also shown to overestimate the solid-state greenhouse by a factor of about 6.
The reason that the solid-state greenhouse effect is insignificant in the case of Antarctic snow is that the wavelengths which do penetrate deeply into snow (visible light) are essentially not absorbed and are scattered back to the surface, whereas the wavelengths that are absorbed by snow (near-infrared) are absorbed in the top few millimeters.
The conditions needed to obtain a significant solid-state greenhouse are examined. The phenomenon becomes important if the scattering coefficient is small (as in blue ice) or if the thermal conductivity is low (as in low-density snow, such as near-surface depth hoar).
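The argument above can be illustrated with a toy Beer-Lambert calculation: resolving a weakly absorbed visible band and a strongly absorbed near-infrared band separately deposits almost no solar energy at depth, whereas a single intermediate "average" absorption coefficient wrongly heats the subsurface. The coefficients below are illustrative, not measured snow properties, and scattering is ignored for simplicity.

```python
# Toy Beer-Lambert sketch of why a single "average" wavelength inflates the
# solid-state greenhouse. Visible light penetrates but is barely absorbed;
# near-IR is absorbed in the top millimetres. One intermediate coefficient
# wrongly deposits energy at depth. All values are illustrative only.
import numpy as np

z = 0.05                          # depth of interest: 5 cm, in metres
F_vis, k_vis = 0.5, 0.1           # visible: penetrates, almost no absorption (1/m)
F_nir, k_nir = 0.5, 1000.0        # near-IR: absorbed in the top few mm (1/m)

def heating(F0, k, z):
    """Local absorbed power per unit depth, -dF/dz, for one exponential band."""
    return F0 * k * np.exp(-k * z)

resolved = heating(F_vis, k_vis, z) + heating(F_nir, k_nir, z)

k_single = np.sqrt(k_vis * k_nir)             # one intermediate "average" coefficient
single = heating(F_vis + F_nir, k_single, z)

print(f"heating at 5 cm depth: resolved bands = {resolved:.3f}, "
      f"single band = {single:.3f} (arbitrary units)")
```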
To evaluate the potential of using surficial shell accumulations for paleoenvironmental studies, an extensive time series of individually dated specimens of the marine infaunal bivalve mollusk Semele casali was assembled using amino acid racemization (AAR) ratios (n = 270) calibrated against radiocarbon ages (n = 32). The shells were collected from surface sediments at multiple sites across a sediment-starved shelf in the shallow sub-tropical São Paulo Bight (São Paulo State, Brazil). The resulting 14C-calibrated AAR time series, one of the largest AAR datasets compiled to date, ranges from modern to 10,307 cal yr BP, is right skewed, and represents a remarkably complete time series: the completeness of the Holocene record is 66% at 250-yr binning resolution and 81% at 500-yr binning resolution. Extensive time-averaging is observed at all sites across the sampled bathymetric range, indicating long, water depth-invariant survival of carbonate shells at the sediment surface under low net sedimentation rates. Benthic organisms collected from active depositional surfaces can provide multi-millennial time series of biomineral records and serve as a source of geochemical proxy data for reconstructing environmental and climatic trends throughout the Holocene at centennial resolution. Surface sediments can contain time-rich shell accumulations that record the entire Holocene, not just the present.
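The calibration step described above maps AAR ratios to calendar ages using the radiocarbon-dated subset. A minimal Python sketch with synthetic data and an illustrative power-law fit follows; the published calibration model may differ.

```python
# Sketch: calibrating amino acid racemization (AAR) D/L ratios against the
# radiocarbon-dated subset, then predicting ages for undated shells. Data are
# synthetic and the power-law form is illustrative, not the published model.
import numpy as np

rng = np.random.default_rng(4)
dl_dated  = rng.uniform(0.05, 0.5, 32)        # D/L ratios of 14C-dated shells
age_dated = 40000 * dl_dated**1.5 * rng.lognormal(0, 0.1, 32)   # cal yr BP

# Fit log(age) = a + b*log(D/L) by least squares.
b, a = np.polyfit(np.log(dl_dated), np.log(age_dated), 1)

dl_undated = rng.uniform(0.05, 0.5, 270)      # remaining, undated shells
predicted_age = np.exp(a + b * np.log(dl_undated))
print(f"fitted exponent b = {b:.2f}; "
      f"median predicted age = {np.median(predicted_age):.0f} cal yr BP")
```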
Cover crops represent a potentially important biological filter during weed community assembly in agroecosystems. This filtering could be considered directional if different cover-crop species result in weed communities with predictably different species composition. We examined the following four questions related to the potential filtering effects of cover crops in a field experiment involving five cover crops grown in monoculture and mixture: (1) Do cover crops differ in their effect on weed community composition? (2) Is competition more intense between cover crops and weeds that are in the same family or functional group? (3) Is competition more intense across weed functional types in a cover-crop mixture compared with cover crops grown in monocultures? (4) Within a cover-crop mixture, is a higher seeding rate associated with more effective biotic filtering of the weed community? We found some evidence that cover crops differentially filtered weed communities and that at least some of these filtering effects were due to differential biomass production across cover-crop species. Monocultures of buckwheat and sorghum–sudangrass reduced the number of weed species relative to the no-cover-crop control by an average of 36 and 59% (buckwheat) and 25 and 40% (sorghum–sudangrass) in 2011 and 2012, respectively. We found little evidence that competition intensity was dependent upon the family or functional classification of the cover crop or weeds, or that cover-crop mixtures were stronger assembly filters than the most effective monocultures. Although our results do not suggest that annual cover crops exert strong directional filtering during weed community assembly, our methodological framework for detecting such effects could be applied to similar future studies that incorporate a greater number of cover-crop species and are conducted under a greater range of cover-cropping conditions.