Explosive ordnance (EO) and explosive weapons (EW) inflict significant suffering on civilian populations in conflict and post-conflict settings. At present, there is limited coordination between humanitarian mine action (HMA) and emergency care for civilian victims of EO/EW. Key informant interviews with sector experts were conducted to evaluate strategies for enhanced engagement between HMA and emergency care capacity-building in EO/EW-affected settings.
Methods
A cross-sectional qualitative study was conducted to interview HMA and health sector experts. Data were analyzed in Dedoose using deductive and inductive coding methods.
Results
Nineteen key informants were interviewed, representing sector experts in HMA, health, and policy domains intersecting with the care of EO/EW casualties. Recommendations included integration of layperson first responder trainings with EO risk education, development of prehospital casualty notification systems with standardized health facility capacity mapping, and refresher trainings for HMA medics at local health facilities.
Conclusions
Medical capabilities within the HMA sector hold potential to strengthen emergency care for civilian EO/EW casualties, yet in the absence of structured coordination strategies they remain underutilized for this purpose. Increased HMA engagement in emergency care may enhance implementation of evidence-based emergency care interventions to decrease preventable death and disability among civilian victims of EO/EW in low-resource settings.
Humanitarian mine action (HMA) stakeholders have an organized presence with well-resourced medical capability in many conflict and post-conflict settings. Humanitarian mine action has the potential to positively augment local trauma care capacity for civilian casualties of explosive ordnance (EO) and explosive weapons (EWs). Yet at present, few strategies exist for coordinated engagement between HMA and the health sector to support emergency care system strengthening to improve outcomes among EO/EW casualties.
Methods:
A scoping literature review was conducted to identify records that described trauma care interventions pertinent to civilian casualties of EO/EW in resource-constrained settings using structured searches of indexed databases and grey literature. A 2017 World Health Organization (WHO) review on trauma systems components in low- and middle-income countries (LMICs) was updated with additional eligible reports describing trauma care interventions in LMICs or among civilian casualties of EO/EWs after 2001.
Results:
A total of 14,195 non-duplicative records were retrieved, of which 48 reports met eligibility criteria. Seventy-four reports from the 2017 WHO review and 16 reports identified from reference lists yielded 138 reports describing interventions in 47 countries. Intervention efficacy was assessed using heterogeneous measures ranging from trainee satisfaction to patient outcomes; only 39 reported mortality differences. Interventions that could feasibly be supported by HMA stakeholders were synthesized into a bundle of opportunities for HMA engagement, designated links in a Civilian Casualty Care Chain (C-CCC).
Conclusions:
This review identified trauma care interventions with the potential to reduce mortality and disability among civilian EO/EW casualties that could be feasibly supported by HMA stakeholders. In partnership with local and multi-lateral health authorities, HMA can leverage their medical capabilities and expertise to strengthen emergency care capacity to improve trauma outcomes in settings affected by EO/EWs.
Bioturbation can increase time averaging by downward and upward movements of young and old shells within the entire mixed layer and by accelerating the burial of shells into a sequestration zone (SZ), allowing them to bypass the uppermost taphonomically active zone (TAZ). However, bioturbation can increase shell disintegration concurrently, neutralizing the positive effects of mixing on time averaging. Bioirrigation by oxygenated pore-water promotes carbonate dissolution in the TAZ, and biomixing itself can mill shells weakened by dissolution or microbial maceration, and/or expose them to damage at the sediment–water interface. Here, we fit transition rate matrices to bivalve age–frequency distributions from four sediment cores from the southern California middle shelf (50–75 m) to assess the competing effects of bioturbation on disintegration and time averaging, exploiting a strong gradient in rates of sediment accumulation and bioturbation created by historic wastewater pollution. We find that disintegration covaries positively with mixing at all four sites, in accord with the scenario where bioturbation ultimately fuels carbonate disintegration. Both mixing and disintegration rates decline abruptly at the base of the 20- to 40-cm-thick, age-homogenized surface mixed layer at the three well-bioturbated sites, despite different rates of sediment accumulation. In contrast, mixing and disintegration rates are very low in the upper 25 cm at an effluent site with legacy sediment toxicity, despite recolonization by bioirrigating lucinid bivalves. Assemblages that formed during maximum wastewater emissions vary strongly in time averaging, with millennial scales at the low-sediment accumulation non-effluent sites, a centennial scale at the effluent site where sediment accumulation was high but bioturbation recovered quickly, and a decadal scale at the second high-sedimentation effluent site where bioturbation remained low for decades. Thus, even though disintegration rates covary positively with mixing rates, reducing postmortem shell survival, bioturbation has the net effect of increasing the time averaging of skeletal remains on this warm-temperate siliciclastic shelf.
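The study's full approach fits transition rate matrices to age-frequency distributions; as a much-simplified, hypothetical sketch of the underlying quantities, the snippet below estimates a constant disintegration rate and a time-averaging measure from a sample of postmortem shell ages (all names and data are illustrative assumptions, not the authors' code).

```python
import numpy as np

def exponential_loss_rate(ages_yr):
    """MLE of a constant disintegration rate, assuming shell loss is a
    Poisson process so postmortem ages are exponentially distributed."""
    return 1.0 / np.mean(ages_yr)

def time_averaging_iqr(ages_yr):
    """Interquartile range of the age-frequency distribution, a common
    robust measure of time averaging."""
    q25, q75 = np.percentile(ages_yr, [25, 75])
    return q75 - q25

# Hypothetical calibrated shell ages (years before collection) for one core increment
rng = np.random.default_rng(0)
ages = rng.exponential(scale=500.0, size=200)  # centennial-scale example

print(f"disintegration rate ~ {exponential_loss_rate(ages):.4f} per yr")
print(f"time averaging (IQR) ~ {time_averaging_iqr(ages):.0f} yr")
```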
Depression is an independent risk factor for cardiovascular disease (CVD), but it is unknown if successful depression treatment reduces CVD risk.
Methods
Using eIMPACT trial data, we examined the effect of modernized collaborative care for depression on indicators of CVD risk. A total of 216 primary care patients with depression and elevated CVD risk were randomized to 12 months of the eIMPACT intervention (internet cognitive-behavioral therapy [CBT], telephonic CBT, and select antidepressant medications) or usual primary care. CVD-relevant health behaviors (self-reported CVD prevention medication adherence, sedentary behavior, and sleep quality) and traditional CVD risk factors (blood pressure and lipid fractions) were assessed over 12 months. Incident CVD events were tracked over four years using a statewide health information exchange.
Results
The intervention group exhibited greater improvement in depressive symptoms (p < 0.01) and sleep quality (p < 0.01) than the usual care group, but there was no intervention effect on systolic blood pressure (p = 0.36), low-density lipoprotein cholesterol (p = 0.38), high-density lipoprotein cholesterol (p = 0.79), triglycerides (p = 0.76), CVD prevention medication adherence (p = 0.64), or sedentary behavior (p = 0.57). There was an intervention effect on diastolic blood pressure that favored the usual care group (p = 0.02). The likelihood of an incident CVD event did not differ between the intervention (13/107, 12.1%) and usual care (9/109, 8.3%) groups (p = 0.39).
Conclusions
Successful depression treatment alone is not sufficient to lower the heightened CVD risk of people with depression. Alternative approaches are needed.
In this survey of 31 hospitals, large metropolitan facilities had 9.5-fold higher odds of reporting preparedness for special pathogens; hospitals with special pathogens teams had 14.3-fold higher odds of reporting preparedness. In the postpandemic world, healthcare institutions must invest in special pathogen responses to maximize patient safety.
To compare the agreement and cost of two recall methods for estimating children’s minimum dietary diversity (MDD).
Design:
We assessed each child's dietary intake on two consecutive days: an observation on day one, followed by two recall methods (list-based recall and multiple-pass recall) administered in random order by different enumerators at two different times on day two. We compared the estimated MDD prevalence using survey-weighted linear probability models following a two one-sided test (TOST) equivalence testing approach. We also estimated the cost-effectiveness of the two methods.
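As a rough sketch of the TOST logic only (using unweighted t-type tests rather than the survey-weighted linear probability models described above, and an assumed ±10-percentage-point equivalence margin), statsmodels' ttost_ind can test equivalence of two MDD prevalence estimates:

```python
import numpy as np
from statsmodels.stats.weightstats import ttost_ind

rng = np.random.default_rng(42)
# Hypothetical binary indicators (1 = child met minimum dietary diversity)
mdd_observed = rng.binomial(1, 0.45, size=600).astype(float)
mdd_recall = rng.binomial(1, 0.47, size=600).astype(float)

# Declare equivalence if the prevalence difference lies within +/-10
# percentage points; the margin here is an illustrative assumption.
p_overall, lower, upper = ttost_ind(mdd_recall, mdd_observed, low=-0.10, upp=0.10)
print(f"TOST p-value: {p_overall:.4f} (equivalence if p < 0.05)")
```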
Setting:
Cambodia (Kampong Thom, Siem Reap, Battambang, and Pursat provinces) and Zambia (Chipata, Katete, Lundazi, Nyimba, and Petauke districts).
Participants:
Children aged 6–23 months: 636 in Cambodia and 608 in Zambia.
Results:
MDD estimations from both recall methods were equivalent to the observation in Cambodia but not in Zambia. Both methods were equivalent to the observation in capturing most food groups. Both methods were highly sensitive, although the multiple-pass method accurately classified a higher proportion of children meeting MDD than the list-based method in both countries. Both methods were highly specific in Cambodia but moderately so in Zambia. Cost-effectiveness was better for the list-based recall method in both countries.
Conclusion:
The two recall methods estimated MDD and most other infant and young child feeding indicators equivalently in Cambodia but not in Zambia, compared to the observation. The list-based method produced slightly more accurate estimates of MDD at the population level, took less time to administer and was less costly to implement.
Understanding the factors contributing to optimal cognitive function throughout the aging process is essential to better understand successful cognitive aging. Processing speed is an age-sensitive cognitive domain that usually declines early in the aging process; however, this cognitive skill is essential for other cognitive tasks and everyday functioning. Evaluating brain network interactions in cognitively healthy older adults can help us understand how variations in brain characteristics affect cognitive functioning. Functional connections among groups of brain areas give insight into the brain's organization, and the cognitive effects of aging may relate to this large-scale organization. To follow up on our prior work, we sought to replicate our findings regarding network segregation's relationship with processing speed. To address possible influences of node location or network membership, we replicated the analysis across 4 different node sets.
Participants and Methods:
Data were acquired as part of a multi-center study of cognitively normal individuals aged 85+, the McKnight Brain Aging Registry (MBAR). For this analysis, we included 146 community-dwelling, cognitively unimpaired older adults, ages 85-99, who had undergone structural and BOLD resting-state MRI scans and a battery of neuropsychological tests. Exploratory factor analysis identified the processing speed factor of interest. We preprocessed BOLD scans using fmriprep, Ciftify, and XCPEngine algorithms. We used 4 different sets of connectivity-based parcellations: 1) MBAR data used to define nodes and the Power (2011) atlas used to determine node network membership, 2) data from younger adults (Chan 2014) used to define nodes and the Power (2011) atlas used to determine node network membership, 3) data from older adults in a different study (Han 2018) used to define nodes and the Power (2011) atlas used to determine node network membership, and 4) MBAR data used to define nodes and MBAR-based community detection used to determine node network membership.
Segregation (the balance of within-network and between-network connections) was measured within the association system and three well-characterized networks: Default Mode Network (DMN), Cingulo-Opercular Network (CON), and Fronto-Parietal Network (FPN). Correlations between processing speed and segregation of the association system and of each network were computed for all 4 node sets.
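For readers unfamiliar with the segregation metric, it is conventionally computed (e.g., following Chan et al. 2014) as the difference between mean within-network and mean between-network connectivity, normalized by the within-network mean. A minimal sketch, assuming a node-by-node correlation matrix and integer network labels (not the study's pipeline):

```python
import numpy as np

def segregation(conn, labels, network):
    """(mean within - mean between) / mean within for one network,
    following the Chan et al. (2014) formulation."""
    in_net = labels == network
    w = conn[np.ix_(in_net, in_net)]
    within = w[np.triu_indices_from(w, k=1)].mean()   # exclude self-connections
    between = conn[np.ix_(in_net, ~in_net)].mean()
    return (within - between) / within

# Hypothetical example: 50 nodes, 3 networks, symmetric correlation matrix
rng = np.random.default_rng(1)
labels = rng.integers(0, 3, size=50)
a = rng.normal(0.1, 0.2, size=(50, 50))
conn = (a + a.T) / 2
print(f"network 0 segregation: {segregation(conn, labels, 0):.3f}")
```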
Results:
We replicated prior work: segregation of the cortical association system and of the FPN and DMN had a consistent relationship with processing speed across all node sets (association system range of correlations: r=.294 to .342, FPN: r=.254 to .272, DMN: r=.263 to .273). Additionally, compared to parcellations created with older adults' data, the parcellation created from younger individuals showed attenuated, less robust versions of the findings obtained with older adults (association system r=.263, FPN r=.255, DMN r=.263).
Conclusions:
This study shows that network segregation in the oldest-old brain is closely linked with processing speed and that this relationship is replicable across different node sets created from varied datasets. This work adds to the growing body of knowledge about age-related dedifferentiation by demonstrating the replicability and consistency of the finding that an essential cognitive skill, processing speed, is associated with differentiated functional networks even in very old individuals experiencing successful cognitive aging.
Background:
Central-line–associated bloodstream infection (CLABSI) rates increased nationally during COVID-19, the drivers of which are still being characterized in the literature. CLABSI rates doubled during the SARS-CoV-2 omicron-variant surge at our rural academic medical center. We sought to identify potential drivers of CLABSIs by comparing period- and patient-specific characteristics of this COVID-19 surge to a historical control period.
Methods:
We defined the study period as the time of highest COVID-19 burden at our hospital (July 2021–June 2022) and the control period as the previous 2 years (July 2019–June 2021). We compared NHSN CLABSI standardized infection ratios (SIRs), central-line standardized utilization ratios (SURs), completion of practice evaluation tools (PETs) for monitoring of central-line bundle compliance, and proportions of traveling nurses. We performed chart reviews to determine patient-specific characteristics of NHSN CLABSIs during these periods, including demographics, comorbidities, central-line characteristics and care, and microbiology.
Results:
The CLABSI SIR was significantly higher during the study period than the control period (0.89 vs 0.52; P = .03); the SUR was significantly higher during the study period (1.08 vs 1.02; P < .01); the PET completion per 100 central-line days was significantly lower during the study period (23.0 vs 31.5; P < .01); and the proportion of traveling nurses was significantly higher during the study period (0.20 vs 0.08; P < .01) (Fig. 1). Patients with NHSN CLABSIs during the study period were more likely to have a history of COVID-19 (27% vs 3%; P = .01) and were more likely to receive a higher level of care (60% vs 27%; P = .02). During the study period, more patients had multilumen catheters (87% vs 61%; P = .04). The type of catheter, catheter care (ie, dressing changes and chlorhexidine bathing), catheter duration before CLABSI, and associated microbiology were similar between the study and control periods (Table 1).
Conclusions:
During the SARS-CoV-2 omicron-variant surge, the increase in CLABSIs at our hospital was significantly associated with increased central-line utilization, decreased PET completion, and increased proportion of traveling nurses. Critical illness and multilumen catheters were significant patient-specific factors that differed between CLABSIs from the study and control periods. We did not observe differences in catheter type, duration, or catheter care. Our study highlights key modifiable risk factors for CLABSI reduction. These findings may be surrogates for other difficult-to-measure challenges related to the culture of safety during a global pandemic, such as staff education related to infection prevention and daily review of central-line necessity.
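For context, the NHSN standardized ratios referenced above are observed counts divided by risk-adjusted predicted counts; a minimal sketch with purely hypothetical numbers:

```python
# NHSN-style ratios: observed events over statistically predicted events.
# All counts below are hypothetical placeholders, not the hospital's data.
observed_clabsi, predicted_clabsi = 16, 18.0
observed_line_days, predicted_line_days = 10800, 10000.0

sir = observed_clabsi / predicted_clabsi          # infection burden vs baseline
sur = observed_line_days / predicted_line_days    # device utilization vs baseline
print(f"SIR = {sir:.2f}, SUR = {sur:.2f}")
```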
Recent meta-analyses demonstrate that small-quantity lipid-based nutrient supplements (SQ-LNS) for young children significantly reduce child mortality, stunting, wasting, anaemia and adverse developmental outcomes. Cost considerations should inform policy decisions. We developed a modelling framework to estimate the cost and cost-effectiveness of SQ-LNS and applied the framework in the context of rural Uganda.
Design:
We adapted costs from a costing study of micronutrient powder (MNP) in Uganda, and based effectiveness estimates on recent meta-analyses and Uganda-specific estimates of baseline mortality and the prevalence of stunting, wasting, anaemia and developmental disability.
Setting:
Rural Uganda.
Participants:
Not applicable.
Results:
Providing SQ-LNS daily to all children in rural Uganda (>1 million) for 12 months (from 6 to 18 months of age) via the existing Village Health Team system would cost ~$52 per child (2020 US dollars), or ~$58.7 million annually. SQ-LNS could avert an average of >242,000 disability-adjusted life years (DALYs) annually as a result of preventing 3689 deaths, >160,000 cases of moderate or severe anaemia and ~6,000 cases of developmental disability. The estimated cost per DALY averted is $242.
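The headline figure follows directly from the reported totals; a quick check of the arithmetic (inputs rounded as reported in the abstract):

```python
# Reproduce the abstract's arithmetic (inputs rounded as reported).
total_cost = 58.7e6        # ~$58.7 million annually, 2020 USD
cost_per_child = 52.0      # ~$52 per child per year
dalys_averted = 242_000    # reported as >242,000 DALYs annually

print(f"implied children covered: {total_cost / cost_per_child / 1e6:.2f} million")
# ~$243 here; the abstract's $242 reflects the unrounded inputs.
print(f"cost per DALY averted: ~${total_cost / dalys_averted:.0f}")
```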
Conclusions:
In this context, SQ-LNS may be more cost-effective than other options such as MNP or the provision of complementary food, although the total cost for a programme including all age-eligible children would be high. Strategies to reduce costs, such as targeting the most vulnerable populations and eliminating taxes on SQ-LNS, may enhance financial feasibility.
For 147 hospital-onset bloodstream infections, we assessed the sensitivity, specificity, positive predictive value, and negative predictive value of the National Healthcare Safety Network surveillance definitions of central-line–associated bloodstream infections against the gold standard of physician review, examining the drivers of discrepancies and related implications for reporting and infection prevention.
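For reference, the four agreement metrics reduce to ratios over a 2×2 table of surveillance calls against physician review; a generic sketch with placeholder counts (not the study's actual tabulation):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV from a 2x2 confusion matrix
    (gold standard = physician review)."""
    return {
        "sensitivity": tp / (tp + fn),  # true CLABSIs the definition catches
        "specificity": tn / (tn + fp),  # non-CLABSIs correctly excluded
        "ppv": tp / (tp + fp),          # flagged cases that are true CLABSIs
        "npv": tn / (tn + fn),          # negative calls that are truly negative
    }

# Hypothetical counts for illustration only (n = 147 total)
print(diagnostic_metrics(tp=40, fp=12, fn=8, tn=87))
```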
To evaluate variables that affect risk of contamination for endoscopic retrograde cholangiopancreatography and endoscopic ultrasound endoscopes.
Design:
Observational, quality improvement study.
Setting:
University medical center with a gastrointestinal endoscopy service performing ∼1,000 endoscopic retrograde cholangiopancreatography and ∼1,000 endoscopic ultrasound endoscope procedures annually.
Methods:
Duodenoscope and linear echoendoscope sampling (from the elevator mechanism and instrument channel) was performed from June 2020 through September 2021. Operational changes during this period included a switch from standard reprocessing with high-level disinfection plus ethylene oxide gas sterilization (HLD-ETO) to double high-level disinfection (dHLD) (June 16, 2020–July 15, 2020) and a change of duodenoscopes to a disposable-tip model (March 2021). The frequency of contamination for the co-primary outcomes was characterized by calculated risk ratios.
Results:
The overall pathogenic contamination rate was 4.72% (6 of 127). Compared to duodenoscopes, linear echoendoscopes had a contamination risk ratio of 3.64 (95% confidence interval [CI], 0.69–19.1). Reprocessing using HLD-ETO was associated with a contamination risk ratio of 0.29 (95% CI, 0.06–1.54). Linear echoendoscopes undergoing dHLD had the highest risk of contamination (2 of 18, 11.1%), while duodenoscopes undergoing HLD-ETO had the lowest (0 of 53, 0%). Duodenoscopes with a disposable tip had a 0% contamination rate (0 of 27).
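The risk ratios above follow the standard 2×2 computation with a log-scale Wald confidence interval; a minimal sketch with placeholder counts (the study's exact cell counts are not fully reported here):

```python
import math

def risk_ratio(a, n1, b, n2, z=1.96):
    """Risk ratio of group 1 vs group 2 with a Wald log-scale 95% CI."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
    lo, hi = rr * math.exp(-z * se), rr * math.exp(z * se)
    return rr, (lo, hi)

# Placeholder counts (not the study's exact 2x2 table)
rr, ci = risk_ratio(a=4, n1=50, b=2, n2=77)
print(f"RR = {rr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```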
Conclusions:
We did not detect a significant reduction in endoscope contamination using HLD-ETO versus dHLD reprocessing. Linear echoendoscopes have a risk of contamination similar to that of duodenoscopes. Disposable tips may reduce the risk of duodenoscope contamination.
Using capture-recapture analysis, we estimate the effective size of the active Amazon Mechanical Turk (MTurk) population that a typical laboratory can access to be about 7,300 workers. We also estimate that the time taken for half of the workers to leave the MTurk pool and be replaced is about 7 months. Each laboratory has its own population pool which overlaps, often extensively, with those of the hundreds of other laboratories using MTurk. Our estimate is based on a sample of 114,460 completed sessions from 33,408 unique participants in 689 sessions across seven laboratories in the US, Europe, and Australia from January 2012 to March 2015.
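The simplest version of this estimation is the two-sample Lincoln-Petersen capture-recapture estimator (the paper's multi-sample analysis is more elaborate); a minimal sketch with hypothetical batch counts:

```python
def lincoln_petersen(n1, n2, m):
    """Estimate population size from two samples: n1 individuals seen in
    sample 1, n2 in sample 2, m seen in both (Chapman correction)."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical worker IDs seen in two batches of HITs
batch1, batch2 = 2000, 1800
recaptured = 490  # workers appearing in both batches
print(f"estimated active pool: {lincoln_petersen(batch1, batch2, recaptured):.0f}")
```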
In this survey of 41 hospitals, 18 (72%) of 25 respondents reporting utilization of National Healthcare Safety Network resources demonstrated accurate central-line–associated bloodstream infection reporting compared to 6 (38%) of 16 without utilization (adjusted odds ratio, 5.37; 95% confidence interval, 1.16–24.8). Adherence to standard definitions is essential for consistent reporting across healthcare facilities.
Monoclonal antibody therapeutics to treat coronavirus disease (COVID-19) have been authorized by the US Food and Drug Administration under Emergency Use Authorization (EUA). Many barriers exist when deploying a novel therapeutic during an ongoing pandemic, and it is critical to assess the needs of incorporating monoclonal antibody infusions into pandemic response activities. We examined the monoclonal antibody infusion site process during the COVID-19 pandemic and conducted a descriptive analysis using data from 3 sites at medical centers in the United States supported by the National Disaster Medical System. Monoclonal antibody implementation success factors included engagement with local medical providers, therapy batch preparation, placing the infusion center in proximity to emergency services, and creating procedures resilient to EUA changes. Infusion process challenges included confirming patient severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) positivity, staffing strain, scheduling, and pharmacy coordination. Infusion sites are effective when integrated into pre-existing pandemic response ecosystems and can be implemented with limited staff and physical resources.
The Variables and Slow Transients Survey (VAST) on the Australian Square Kilometre Array Pathfinder (ASKAP) is designed to detect highly variable and transient radio sources on timescales from 5 s to ~5 yr. In this paper, we present the survey description, observation strategy, and initial results from the VAST Phase I Pilot Survey. This pilot survey consists of ~162 h of observations conducted at a central frequency of 888 MHz between 2019 August and 2020 August, with a typical rms sensitivity of 0.24 mJy beam⁻¹ and angular resolution of 12–20 arcseconds. There are 113 fields, each of which was observed for a 12-min integration time, with between 5 and 13 repeats and cadences between 1 day and 8 months. The total area of the pilot survey footprint is 5,131 square degrees, covering six distinct regions of the sky. An initial search of two of these regions, totalling 1,646 square degrees, revealed 28 highly variable and/or transient sources. Seven of these are known pulsars, including the millisecond pulsar J2039–5617. Another seven are stars, four of which have no previously reported radio detection (SCR J0533–4257, LEHPM 2-783, UCAC3 89–412162 and 2MASS J22414436–6119311). Of the remaining 14 sources, two are active galactic nuclei, six are associated with galaxies and the other six have no multi-wavelength counterparts and are yet to be identified.
The majority of injury deaths occur outside health facilities. However, many low- and middle-income countries (LMICs) continue to lack efficient Emergency Medical Services (EMS). Understanding current first aid practices and perceptions among members of the community is vital to strengthening non-EMS, community-based prehospital care.
Study Objective:
This study sought to determine caregiver first aid practices and care-seeking behavior for common household child injuries in rural communities in Ghana to inform context-specific interventions to improve prehospital care in LMICs.
Methods:
A cluster-randomized, population-based household survey of caregivers of children under five years in a rural sub-district (Amakom) in Ghana was conducted. Caregivers were asked about their practices and care-seeking behaviors should children sustain injuries at home. Common injuries of interest were burns, laceration, choking, and fractures. Multiple responses were permitted and reported practices were categorized as: recommended, low-risk, or potentially harmful to the child. Logistic regression was used to examine the association between caregiver characteristics and first aid practices.
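A hedged sketch of the kind of logistic regression described (column names and data are invented placeholders, not the survey's codebook):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical caregiver-level records; column names are invented.
df = pd.DataFrame({
    "recommended_only": [1, 0, 1, 1, 0, 1, 0, 1, 0, 1, 0, 0],
    "age_years":        [25, 41, 33, 29, 52, 36, 23, 45, 31, 38, 27, 49],
    "basic_education":  [1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1],
})

# Log-odds of reporting only recommended first aid practices as a
# function of caregiver characteristics.
model = smf.logit("recommended_only ~ age_years + basic_education", data=df).fit(disp=0)
print(model.params)
```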
Results:
Three hundred and fifty-seven individuals were sampled, representing 5,634 caregivers in Amakom. Mean age was 33 years. Most (79%) were mothers to the children; 68% had only completed basic education. Most caregivers (64%-99%) would employ recommended first aid practices to manage common injuries, such as running cool water over a burn injury or tying a bleeding laceration with a piece of cloth. Nonetheless, seven percent to 56% would also employ practices which were potentially harmful to the child, such as attempting manual removal of a choking object or treating fractures at home without taking the child to a health facility. Reporting only recommended practices ranged from zero percent (burns) to 93% (choking). Reporting only potentially harmful practices ranged from zero percent (burns) to 20% (fractures). Univariate regression analysis did not reveal consistent associations between various caregiver characteristics and the employment of recommended only or potentially harmful only first aid practices.
Conclusions:
Caregivers in rural Ghanaian communities reported using some recommended first aid practices for common household injuries in children. However, they also employed many potentially harmful practices. This study highlights the need to increase context-appropriate, community-targeted first aid training programs for rural community populations of LMICs. This is important as the home-based care provided for injured children in these communities might be the only care they receive.
The Rapid ASKAP Continuum Survey (RACS) is the first large-area survey to be conducted with the full 36-antenna Australian Square Kilometre Array Pathfinder (ASKAP) telescope. RACS will provide a shallow model of the ASKAP sky that will aid the calibration of future deep ASKAP surveys. RACS will cover the whole sky visible from the ASKAP site in Western Australia and will cover the full ASKAP band of 700–1800 MHz. The RACS images are generally deeper than the existing NRAO VLA Sky Survey and Sydney University Molonglo Sky Survey radio surveys and have better spatial resolution. All RACS survey products will be public, including radio images (with ~15 arcsec resolution) and catalogues of about three million source components with spectral index and polarisation information. In this paper, we present a description of the RACS survey and the first data release of 903 images covering the sky south of declination +41°, made over a 288-MHz band centred at 887.5 MHz.
A number of genomic conditions caused by copy number variants (CNVs) are associated with a high risk of neurodevelopmental and psychiatric disorders; such variants are referred to here as ND-CNVs. Although these patients also tend to have cognitive impairments, few studies have investigated the range of emotion and behaviour problems in young people with ND-CNVs using measures that are suitable for those with learning difficulties.
Methods
A total of 322 young people with 13 ND-CNVs across eight loci (mean age: 9.79 years, range: 6.02–17.91, 66.5% male) took part in the study. Primary carers completed the Developmental Behaviour Checklist (DBC).
Results
Overall, 69% of individuals with an ND-CNV screened positive for clinically significant difficulties. Young people from families with higher incomes (OR = 0.71, CI = 0.55–0.91, p = .008) were less likely to screen positive. The rate of difficulties differed depending on ND-CNV genotype (χ2 = 39.99, p < 0.001), with the lowest rate in young people with 22q11.2 deletion (45.7%) and the highest in those with 1q21.1 deletion (93.8%). Specific patterns of strengths and weaknesses were found for different ND-CNV genotypes. However, ND-CNV genotype explained no more than 9–16% of the variance, depending on DBC subdomain.
Conclusions
Emotion and behaviour problems are common in young people with ND-CNVs. The ND-CNV specific patterns we find can provide a basis for more tailored support. More research is needed to better understand the variation in emotion and behaviour problems not accounted for by genotype.
Many perennial bioenergy grasses have the potential to escape cultivation and invade natural areas. We quantify dispersal, a key component in invasion, for two bioenergy candidates: Miscanthus sinensis and M. × giganteus. For each species, approximately 1 × 10⁶ caryopses dispersed anemochorously from a point source into traps placed in annuli near the source (0.5 to 5 m; 1.6 to 16.4 ft) and in arcs (10 to 400 m) in the prevailing wind direction. For both species, most caryopses (95% for M. sinensis and 77% for M. × giganteus) were captured within 50 m of the source, but a small percentage (0.2 to 3%) were captured at 300 m and 400 m. Using a maximum-likelihood approach, we evaluated the degree of support in our empirical dispersal data for competing functions to describe seed-dispersal kernels. Fat-tailed functions (lognormal, Weibull, and gamma (Γ)) fit dispersal patterns best for both species overall, but because M. sinensis dispersal distances were significantly affected by wind speed, curves were also fit separately for dispersal distances in low, moderate, and high wind events. Wind speeds shifted the M. sinensis dispersal curve from a thin-tailed exponential function at low speeds to fat-tailed lognormal functions at moderate and high wind speeds. M. sinensis caryopses traveled farther in higher wind speeds (low, 30 m; moderate, 150 m; high, 400 m). Our results demonstrate the ability of Miscanthus caryopses to travel long distances and raise important implications for potential escape and invasion of fertile Miscanthus varieties from bioenergy cultivation.
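The kernel-comparison step can be sketched with scipy's built-in maximum-likelihood fitters and AIC, using the candidate families named above; the data below are simulated placeholders, not the trap counts:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
distances = rng.lognormal(mean=2.5, sigma=1.0, size=500)  # hypothetical dispersal distances (m)

candidates = {
    "lognormal": stats.lognorm,
    "weibull": stats.weibull_min,
    "gamma": stats.gamma,
    "exponential": stats.expon,
}

for name, dist in candidates.items():
    params = dist.fit(distances, floc=0)          # MLE, location pinned at 0
    loglik = dist.logpdf(distances, *params).sum()
    k = len(params) - 1                           # free parameters (loc fixed)
    aic = 2 * k - 2 * loglik
    print(f"{name:>12s}: AIC = {aic:.1f}")        # lower AIC = better support
```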
The Asian grass Miscanthus sinensis (Poaceae) is being considered for use as a bioenergy crop in the U.S. Corn Belt. Originally introduced to the United States for ornamental plantings, it escaped, forming invasive populations. The concern is that naturalized M. sinensis populations have evolved shade tolerance. We tested the hypothesis that seedlings from within the invasive U.S. range of M. sinensis would display traits associated with shade tolerance, namely increased area for light capture and phenotypic plasticity, compared with seedlings from the native Japanese populations. In a common garden experiment, seedlings of 80 half-sib maternal lines from the native range (Japan) and 60 half-sib maternal lines from the invasive range (U.S.) were grown under four light levels. Seedling leaf area, leaf size, growth, and biomass allocation were measured after 12 wk. Seedlings from both regions responded strongly to the light gradient. High light conditions resulted in seedlings with greater leaf area, larger leaves, and a shift to greater belowground biomass investment, compared with shaded seedlings. Japanese seedlings produced more biomass and total leaf area than U.S. seedlings across all light levels. Generally, U.S. and Japanese seedlings allocated a similar amount of biomass to foliage and equal leaf area per leaf mass. Subtle differences in light response by region were observed for total leaf area, mass, growth, and leaf size. U.S. seedlings had slightly higher plasticity for total mass and leaf area but lower plasticity for measures of biomass allocation and leaf traits compared with Japanese seedlings. Our results do not provide general support for the hypothesis of increased M. sinensis shade tolerance within its introduced U.S. range compared with native Japanese populations.