The commercialization of targeted sprayer systems allows producers to reduce herbicide inputs but risks leaving emerged weeds untreated. Currently, targeted applications with the John Deere system allow for five spray sensitivity settings, and no published literature discusses the impact of these settings on detecting and spraying weeds of varying species, sizes, and positions in crops. Research was conducted in Arkansas, Illinois, Indiana, Mississippi, and North Carolina in corn, cotton, and soybean to determine how various factors might influence the ability of targeted applications to treat weeds. The data included 21 weed species aggregated into six classes, with heights, widths, and densities ranging from 0.25 to 25 cm, 0.25 to 25 cm, and 0.04 to 14.3 plants m⁻², respectively. Crop and weed density did not influence the likelihood of treating weeds. As expected, the sensitivity setting altered the ability to treat weeds. Targeted applications (across sensitivity settings, at the median weed height and width and a density of 2.4 plants m⁻²) resulted in treatment success of 99.6% to 84.4%, 99.1% to 68.8%, 98.9% to 62.9%, 99.1% to 70.3%, 98.0% to 48.3%, and 98.5% to 55.8% for Convolvulaceae, decumbent broadleaf weeds, Malvaceae, Poaceae, Amaranthaceae, and yellow nutsedge, respectively. Reducing the sensitivity setting reduced the ability to treat weeds. Weed size aided targeted application success: larger weeds were more easily detected and therefore more readily treated. Based on these findings, various conditions could impact the outcome of targeted multi-nozzle applications, and the analyses highlight some of the parameters to consider when using these technologies.
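The abstract models the likelihood that a weed is detected and treated. As a minimal sketch of one way such a likelihood could be estimated, the logistic regression below uses simulated data; every column name, coefficient, and value is an illustrative assumption rather than the study's actual model or data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated observation-level data patterned on the factors the abstract
# names (sensitivity setting, weed size); values are illustrative only.
rng = np.random.default_rng(1)
n = 400
sensitivity = rng.integers(1, 6, n)        # spray sensitivity setting, 1-5
height = rng.uniform(0.25, 25, n)          # weed height, cm
logit_p = -2.0 + 0.6 * sensitivity + 0.12 * height
treated = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

df = pd.DataFrame({"treated": treated, "sensitivity": sensitivity,
                   "height_cm": height})

# Logistic regression for the probability a weed is detected and treated.
fit = smf.logit("treated ~ sensitivity + height_cm", data=df).fit(disp=False)
print(fit.params)  # positive signs: higher settings and larger weeds -> treated
```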
Grand solar minima are periods spanning from decades to more than a century during which solar activity is unusually low. A cluster of such minima occurred during the last millennium, as evidenced by reductions in the numbers of sunspots observed and coeval increases in cosmogenic isotope production. Prior to the period of instrumental records, natural archives of such isotopes are the only resources available for detecting grand solar minima. Here, we examine the period 433–315 BCE, which saw a sustained increase in the production of the cosmogenic isotope radiocarbon. Our new time series of radiocarbon data (Δ14C), obtained on cellulose extracted from known-age oak tree rings from Germany, reveals that the rise in production at this time was commensurate with patterns observed over recent grand solar minima. Our data also enhance, and to a degree challenge, the accuracy of the international atmospheric radiocarbon record over this period.
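For readers unfamiliar with the notation: the abstract does not define Δ14C, but assuming the standard Stuiver–Polach reporting convention, it is the decay- and fractionation-corrected deviation of a sample's 14C activity from the modern standard, expressed in per mil:

$$\Delta^{14}\mathrm{C} = \left( F\, e^{\lambda(1950 - t)} - 1 \right) \times 1000\ \text{‰},$$

where $F$ is the fractionation-corrected fraction modern, $t$ is the calendar year of ring growth (negative for BCE years), and $\lambda = 1/8267\ \mathrm{yr}^{-1}$ is the decay constant of 14C.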
Abacs approximating the product-moment correlation for both explicit and implicit selection are presented. These abacs give accuracy to within .01 of the corresponding analytic estimate.
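As context for what such abacs approximate: for explicit selection, one standard analytic estimate is Thorndike's Case II correction for restriction of range. The sketch below assumes that formula is the relevant analytic estimate; the numbers are made up.

```python
import math

def corrected_r(r_restricted: float, sd_unrestricted: float,
                sd_restricted: float) -> float:
    """Thorndike Case II: estimate the unrestricted product-moment
    correlation from a sample subject to explicit selection."""
    k = sd_unrestricted / sd_restricted   # SD ratio (unrestricted/restricted)
    r = r_restricted
    return (r * k) / math.sqrt(1.0 - r**2 + (r**2) * (k**2))

# Example: r = .30 in a selected group whose SD on the selection variable
# is half that of the full applicant pool.
print(round(corrected_r(0.30, 2.0, 1.0), 2))  # ~= 0.53
```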
Creating a sustainable residency research program is necessary to develop a durable research pipeline, as highlighted by the recent Society for Academic Emergency Medicine 2024 Consensus Conference. We sought to describe the implementation of a novel, immersive research program for first-year emergency medicine residents. We describe the curriculum development, rationale, implementation process, and lessons learned from the implementation of a year-long research curriculum for first-year residents. We further evaluated residents’ perceived confidence in research methodology, interest in research, and the importance of their research experience through a 32-item survey. In two cohorts, 25 first-year residents completed the program. All residents met their scholarly project requirements by the end of their first year. Two conference abstracts and one peer-reviewed manuscript were accepted for publication, and another manuscript is currently under review. Survey responses indicated an increase in residents’ perceived confidence in research methodology, although interpretation is limited by the small sample size. In summary, this novel resident research curriculum demonstrated a standardized, reproducible, and sustainable approach to providing residents with an immersive research program.
Although the link between alcohol involvement and behavioral phenotypes (e.g. impulsivity, negative affect, executive function [EF]) is well established, the directionality of these associations, their specificity to stages of alcohol involvement, and the extent of shared genetic liability remain unclear. We estimated longitudinal associations among transitions between alcohol milestones, behavioral phenotypes, and indices of genetic risk.
Methods
Data came from the Collaborative Study on the Genetics of Alcoholism (n = 3681; ages 11–36). Alcohol transitions (first: drink, intoxication, alcohol use disorder [AUD] symptom, AUD diagnosis), internalizing, and externalizing phenotypes came from the Semi-Structured Assessment for the Genetics of Alcoholism. EF was measured with the Tower of London and Visual Span Tasks. Polygenic scores (PGS) were computed for alcohol-related and behavioral phenotypes. Cox models estimated associations among PGS, behavior, and alcohol milestones.
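To make the modeling step concrete, here is a minimal Cox proportional hazards sketch using the lifelines package; the data frame, column names, and values are hypothetical stand-ins for the study's milestone times, event indicators, polygenic scores, and behavioral covariates.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical per-participant data: years to first AUD symptom, an event
# indicator (0 = censored), a drinks-per-week PGS, and baseline conduct
# disorder symptoms. All names and values are illustrative.
df = pd.DataFrame({
    "years_to_aud_symptom": [4.0, 7.5, 2.1, 9.0, 5.5, 3.2, 8.1, 6.4],
    "event":                [1,   0,   1,   0,   1,   1,   0,   1],
    "dpw_pgs":              [0.8, -0.3, 1.2, 0.9, 0.1, -0.6, -0.7, 0.4],
    "conduct_symptoms":     [3,   0,   5,   3,   1,   4,   0,   2],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_aud_symptom", event_col="event")
cph.print_summary()  # the exp(coef) column reports the hazard ratios
```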
Results
Externalizing phenotypes (e.g. conduct disorder symptoms) were associated with future initiation and drinking problems (hazard ratio [HR] ⩾ 1.16). Internalizing phenotypes (e.g. social anxiety) were associated with increased hazards for progression from first drink to severe AUD (HR ⩾ 1.55). Initiation and AUD were associated with increased hazards for later depressive symptoms and suicidal ideation (HR ⩾ 1.38), and initiation was associated with increased hazards for future conduct symptoms (HR = 1.60). EF was not associated with alcohol transitions. The drinks-per-week PGS was linked with increased hazards for alcohol transitions (HR ⩾ 1.06). The problematic alcohol use PGS increased hazards for suicidal ideation (HR = 1.20).
Conclusions
Behavioral markers of addiction vulnerability precede and follow alcohol transitions, highlighting dynamic, bidirectional relationships between behavior and emerging addiction.
The prospects for Australia’s relations with its most immediate region at the beginning of the decade seemed bleak indeed. On the surface at least, they continued to be troubled as the rise of terrorism and people smuggling as major security issues, and Australia’s participation in the 2003 invasion of Iraq, introduced new sources of tension into Australia’s relations with its northern neighbours. Yet by 2005 the picture had changed remarkably. John Howard’s government, despite being dogged by diplomatic gaffes and pilloried by its critics, had achieved some remarkable successes in Australia’s relations with South-East Asia. Bilateral trade agreements had been signed or were under negotiation with the region’s major economies. Talks had begun on a new security agreement with Indonesia, and the Australian Prime Minister seemed to have forged a close rapport with the first directly elected Indonesian President. Perhaps most significantly, Howard was invited to two meetings crucial to the development of a new East Asian regional institution: the Association of South-East Asian Nations (ASEAN) Summit in November 2004 and the inaugural East Asia Summit in December 2005.
The machinery of Australia’s foreign policy-making was transformed during the first decade of the twenty-first century, perhaps more profoundly than at any stage since the creation of an independent Department of External Affairs in November 1935. Until that time, the foreign affairs function of the Commonwealth government had been administered from within the Prime Minister’s Department. From its modest beginnings in 1935 in a clutch of rooms on the ground floor of Canberra’s West Block administrative building, the Department of External Affairs, then Foreign Affairs, then Foreign Affairs and Trade (DFAT) grew steadily in size and confidence. When DFAT moved into its imposing new headquarters on the edge of State Circle in 1996, it symbolised a coming of age of a powerful, confident bureau of state with full and independent stewardship of the nation’s foreign affairs. While prime ministers from Sir Robert Menzies to Paul Keating may have felt strongly about particular international causes, few questioned that DFAT and its ministers played the central role in initiating and implementing policy across the full suite of Australia’s international interests.
Beginning with the floating of the Thai baht on 2 July 1997, a regional crisis unfolded that saw the magic disappear from the economies of East Asia. What appeared initially to be merely the sharp devaluation of a single currency turned into an economic free-fall that rippled across neighbouring economies and eventually the entire region. By early September 1997, the Malaysian ringgit had fallen to its lowest level against the US dollar since 1971; in the space of six months the Thai stock market had lost 38 per cent of its value, Malaysia’s 44 per cent, the Philippines’ 35 per cent, Indonesia’s 17 per cent, and Japan’s 4 per cent. By year’s end, the Indonesian and South Korean economies had been brought to their knees, and speculation had begun that East Asia would drag the global economy into a bout of chronic deflation.
We show that if an open set in $\mathbb{R}^d$ can be fibered by unit $n$-spheres, then $d \geq 2n+1$, and if $d = 2n+1$, then the spheres must be pairwise linked and $n \in \left\{0, 1, 3, 7\right\}$. For these values of $n$, we construct unit $n$-sphere fibrations in $\mathbb{R}^{2n+1}$.
While grain farming has seen a major shift toward organic production in recent years, the USA continues to lag behind, with domestic demand continuing to outpace domestic supply and making the USA a net importer. The Midwestern USA is poised to help remedy this imbalance; however, farmers have been slow to transition to organic production systems. Existing literature has identified three prevalent narratives that farmers use to frame their organic transition: environmentalism, farm-family legacy, and economic factors, in addition to a fourth, untested religiosity narrative. This study sought to better understand how these different narratives frame grain farmers’ thought processes for transitioning from conventional production systems to certified organic production systems. We co-created narratives around organic production with farmers, which resulted in four passages aligned with the literature: farm-family legacy, economic values, environmental values, and Christianity and stewarding Eden. Then, we mailed a paper survey to conventional, in-transition, and certified organic Indiana grain farmers to test how these different narratives motivated organic production. We found that the most prevalent narrative around organic production is farm-family legacy, which specifically resonated with midsize farmers. We also found that the religious stewardship narrative resonated with a substantial number of organic and mixed-practice farmers, likely due to Amish farmers within the sample. These results shed light on the role that narratives and associated values play in organic practice use and can inform the organic efforts of agricultural professionals.
Narrow-windrow burning (NWB) is a form of harvest weed seed control in which crop residues and weed seeds collected by the combine are concentrated into windrows and subsequently burned. The objectives of this study were to 1) determine how NWB affects seed survival of Italian ryegrass in wheat and Palmer amaranth in soybean and 2) determine whether a relationship exists between NWB heat index (HI; the sum of temperatures above ambient) or effective burn time (EBT; the cumulative number of seconds temperatures exceed 200 C) and post-NWB seed survival of both species. Average soybean and wheat windrow HI totaled 140,725 ± 14,370 and 66,196 ± 6,224 C, and 259 ± 27 and 116 ± 12 s of EBT, respectively. Pre-NWB versus post-NWB germinability testing revealed an estimated seed kill rate of 79.7% for Italian ryegrass and 86.3% for Palmer amaranth. Nonlinear two-parameter exponential regressions between seed kill and HI or EBT indicated that NWB at an HI of 146,000 C and 277 s of EBT potentially kills 99% of Palmer amaranth seed. Seventy-six percent of soybean windrow burning events resulted in estimated Palmer amaranth seed kill rates greater than 85%. Predicted Italian ryegrass seed kill was greater than 97% in all but two wheat NWB events; therefore, relationships were not calculated. These results validate the ability of NWB to reduce seed survival, thereby improving weed management and combating herbicide resistance.
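The regression form is not spelled out beyond "two-parameter exponential"; one common choice consistent with seed kill saturating toward an asymptote is y = a(1 − e^(−bx)), sketched below with made-up data points rather than the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# Two-parameter exponential rise toward an asymptote: seed kill saturates
# as heat index (HI) increases. Data points are illustrative only.
def seed_kill(hi, a, b):
    return a * (1.0 - np.exp(-b * hi))

hi = np.array([20_000, 60_000, 100_000, 140_000, 180_000], dtype=float)
kill = np.array([0.42, 0.71, 0.88, 0.96, 0.99])

(a, b), _ = curve_fit(seed_kill, hi, kill, p0=(1.0, 1e-5))
print(f"asymptote a = {a:.3f}, rate b = {b:.2e}")
print(f"predicted kill at HI = 146,000 C: {seed_kill(146_000, a, b):.1%}")
```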
Early in the COVID-19 pandemic, the World Health Organization stressed the importance of daily clinical assessments of infected patients, yet current approaches frequently consider cross-sectional timepoints, cumulative summary measures, or time-to-event analyses. Statistical methods are available that make use of the rich information content of longitudinal assessments. We demonstrate the use of a multistate transition model to assess the dynamic nature of COVID-19-associated critical illness using daily evaluations of COVID-19 patients from 9 academic hospitals. We describe the accessibility and utility of methods that consider the clinical trajectory of critically ill COVID-19 patients.
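As a sketch of the core idea behind a multistate transition model: tally day-to-day transitions between ordinal clinical states and row-normalize to obtain an empirical one-day transition probability matrix. The states and trajectories below are hypothetical, and a full multistate model would add covariates and uncertainty estimates.

```python
import numpy as np

# Daily ordinal states for a few hypothetical patients:
# 0 = floor care, 1 = ICU, 2 = discharged, 3 = deceased (absorbing).
trajectories = [
    [0, 0, 1, 1, 0, 2],
    [0, 1, 1, 3],
    [0, 0, 0, 2],
]

n_states = 4
counts = np.zeros((n_states, n_states))
for traj in trajectories:
    for s, t in zip(traj, traj[1:]):
        counts[s, t] += 1  # tally observed day-to-day transitions

# Row-normalize to get an empirical one-day transition probability matrix.
row_sums = counts.sum(axis=1, keepdims=True)
P = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
print(P.round(2))
```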
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated the seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Based on the results, weather conditions had no consistent impact on weed seed shatter. However, greater individual weed plant biomass was associated with delayed seed shattering at harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs from plants that escaped early-season management practices and retained seed through harvest. However, smaller individuals within the same population that shatter seed before harvest risk escaping both early-season management and HWSC.
The coronavirus disease 2019 (COVID-19) pandemic has significantly increased depression rates, particularly in emerging adults. The aim of this study was to examine longitudinal changes in depression risk before and during COVID-19 in a cohort of emerging adults in the U.S. and to determine whether prior drinking or sleep habits could predict the severity of depressive symptoms during the pandemic.
Methods
Participants were 525 emerging adults from the National Consortium on Alcohol and NeuroDevelopment in Adolescence (NCANDA), a five-site community sample including moderate-to-heavy drinkers. Poisson mixed-effects models evaluated changes in the Center for Epidemiological Studies Depression Scale (CES-D-10) from before to during COVID-19, also testing for sex and age interactions. Additional analyses examined whether alcohol use frequency or sleep duration measured at the last pre-COVID assessment predicted the pandemic-related increase in depressive symptoms.
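For orientation, a deliberately simplified stand-in for the Poisson mixed-effects model is sketched below: a formula-based Poisson GLM in statsmodels with standard errors clustered on participant. This is an assumption-laden approximation (a true mixed model adds random effects per participant), and all data and column names are invented.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Illustrative long-format data: repeated CES-D-10 counts per participant.
df = pd.DataFrame({
    "cesd":         [6, 8, 14, 4, 5, 11, 9, 10, 18],
    "during_covid": [0, 0, 1,  0, 0, 1,  0, 0,  1],
    "female":       [1, 1, 1,  0, 0, 0,  1, 1,  1],
    "subject":      [1, 1, 1,  2, 2, 2,  3, 3,  3],
})

# Poisson GLM with cluster-robust standard errors as a simplified stand-in
# for the study's Poisson mixed-effects model with repeated measures.
model = smf.glm("cesd ~ during_covid * female", data=df,
                family=sm.families.Poisson())
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["subject"]})
print(result.summary())
```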
Results
The prevalence of risk for clinical depression tripled due to a substantial and sustained increase in depressive symptoms during COVID-19 relative to pre-COVID years. Effects were strongest for younger women. Frequent alcohol use and short sleep duration during the closest pre-COVID visit predicted a greater increase in COVID-19 depressive symptoms.
Conclusions
The sharp increase in depression risk among emerging adults heralds a public health crisis with alarming implications for their social and emotional functioning as this generation matures. In addition to the heightened risk for younger women, the role of alcohol use and sleep behavior should be tracked through preventive care aiming to mitigate this looming mental health crisis.
We sought to determine who is involved in the care of a trauma patient.
Methods:
We recorded hospital personnel involved in 24 adult Priority 1 trauma patient admissions for 12 h or until patient demise. Hospital personnel were delineated by professional background and role.
Results:
We cataloged 19 males and 5 females with a median age of 50 y (interquartile range [IQR], 35.5-67.5). The average number of hospital personnel involved was 79.71 (standard deviation, 17.62; standard error, 3.6). A median of 51.2% (IQR, 43.4%-59.8%) of personnel were first involved within the first hour. More personnel were involved in direct versus indirect care (median, 54.5 [IQR, 47.5-67.0] vs 25.0 [IQR, 22.0-30.5]; P < 0.0001). The median numbers of health-care professionals and auxiliary staff were 74.5 (IQR, 63.5-90.5) and 6.0 (IQR, 5.0-7.0), respectively. More personnel were first involved in hospital locations external to the emergency department (median, 53.0 [IQR, 41.5-63.0] vs 27.5 [IQR, 24.0-30.0]; P < 0.0001). No differences existed in total personnel by Injury Severity Score (P = 0.1266), day (P = 0.7270), or time of admission (P = 0.2098).
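The abstract does not name the statistical test behind these P-values; given paired per-admission counts and medians with IQRs, a nonparametric paired comparison such as the Wilcoxon signed-rank test would be one natural choice. The sketch below uses invented per-admission counts.

```python
import numpy as np
from scipy.stats import wilcoxon

# Illustrative per-admission counts of personnel giving direct vs indirect
# care (the abstract reports medians of 54.5 and 25.0; values are made up).
direct = np.array([48, 55, 67, 52, 60, 47, 58, 71])
indirect = np.array([22, 25, 31, 24, 28, 23, 26, 30])

stat, p = wilcoxon(direct, indirect)  # paired nonparametric comparison
print(f"W = {stat:.1f}, p = {p:.4f}")
print(f"medians: direct = {np.median(direct)}, indirect = {np.median(indirect)}")
```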
Conclusions:
A large number of hospital personnel with varying job responsibilities respond to severe trauma. These data may guide hospital staffing and disaster preparedness policies.
Yield losses due to weeds are a major threat to wheat production and the economic well-being of farmers in the United States and Canada. The objective of this Weed Science Society of America (WSSA) Weed Loss Committee report is to provide estimates of wheat yield and economic losses due to weeds. Weed scientists provided both weedy (best management practices but no weed control practices) and weed-free (best management practices providing >90% weed control) average yields from replicated research trials in both winter and spring wheat from 2007 to 2017. Winter wheat yield loss estimates ranged from 2.9% to 34.4%, with a weighted average (by production) of 25.6% for the United States, 2.9% for Canada, and 23.4% combined. Based on these yield loss estimates and total production, the potential winter wheat loss due to weeds is 10.5, 0.09, and 10.5 billion kg, with a potential loss in value of US$2.19, US$0.19, and US$2.19 billion for the United States, Canada, and combined, respectively. Spring wheat yield loss estimates ranged from 7.9% to 47.0%, with a weighted average (by production) of 33.2% for the United States, 8.0% for Canada, and 19.5% combined. Based on these yield loss estimates and total production, the potential spring wheat loss is 4.8, 1.6, and 6.6 billion kg, with a potential loss in value of US$1.14, US$0.37, and US$1.39 billion for the United States, Canada, and combined, respectively. Yield loss in this analysis is greater than in some previous estimates, likely indicating an increasing threat from weeds. Climate is affecting yield loss in winter wheat in the Pacific Northwest, with percent yield loss highest in wheat-fallow systems that receive less than 30 cm of annual precipitation. Continued investment in weed science research for wheat is critical for continued yield protection.
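To show the arithmetic structure of such estimates: if the loss percentage is read as the share of weed-free potential production lost (an assumption about the committee's method, not a statement of it), harvested production converts to potential loss as sketched below; the production figure is hypothetical.

```python
# Convert a percent yield loss into potential production lost, assuming the
# loss applies to weed-free potential production (an illustrative assumption).
def potential_loss_kg(actual_production_kg: float, loss_fraction: float) -> float:
    potential = actual_production_kg / (1.0 - loss_fraction)  # weed-free potential
    return potential - actual_production_kg                   # kg lost to weeds

# Hypothetical example: 30.5 billion kg harvested under a 25.6% loss rate.
print(f"{potential_loss_kg(30.5e9, 0.256) / 1e9:.1f} billion kg")  # ~= 10.5
```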
The direct carbonate procedure for accelerator mass spectrometry radiocarbon (AMS 14C) dating of submilligram samples of biogenic carbonate without graphitization is becoming widely used in a variety of studies. We compare the results of 153 paired direct carbonate and standard graphite 14C determinations on single specimens of an assortment of biogenic carbonates. A reduced major axis regression shows a strong relationship between direct carbonate and graphite percent Modern Carbon (pMC) values (m = 0.996; 95% CI [0.991–1.001]). An analysis of differences and a 95% confidence interval on pMC values reveals that there is no significant difference between direct carbonate and graphite pMC values for 76% of analyzed specimens, although variation in direct carbonate pMC is underestimated. The difference between the two methods is typically within 2 pMC, with 61% of direct carbonate pMC measurements being higher than their paired graphite counterpart. Of the 36 specimens that did yield significant differences, all but three missed the 95% significance threshold by 1.2 pMC or less. These results show that direct carbonate 14C dating of biogenic carbonates is a cost-effective and efficient complement to standard graphite 14C dating.
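A reduced major axis regression is simple to reproduce: the slope is the ratio of the standard deviations signed by the Pearson correlation, and the line passes through the means. The sketch below uses invented pMC pairs, not the study's 153 measurement pairs.

```python
import numpy as np

def rma(x, y):
    """Reduced major axis regression: slope = sign(r) * sd(y)/sd(x),
    with the line passing through the means of x and y."""
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)
    intercept = np.mean(y) - slope * np.mean(x)
    return slope, intercept

# Illustrative paired pMC values (graphite vs direct carbonate).
graphite = np.array([12.4, 35.8, 57.1, 78.9, 99.2])
carbonate = np.array([13.0, 36.2, 56.5, 79.5, 99.8])
print(rma(graphite, carbonate))  # slope near 1 indicates method agreement
```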
Hairy buttercup and cutleaf evening primrose are winter annual weeds that have become more problematic for winter wheat growers in the southern Great Plains and the midsouthern United States in recent years. Little research exists on which to base recommendations for controlling hairy buttercup in wheat, and little research has been published on cutleaf evening primrose control in recent years. With growing concerns about increased herbicide resistance among winter annual weeds, incorporating new herbicide sites of action has become necessary. The objective of this study was to assess halauxifen-methyl as a novel herbicide for controlling these two problematic winter annual broadleaf weeds in winter wheat in Mississippi and Oklahoma. Studies were conducted across four site-years in Mississippi and one site-year in Oklahoma comparing 15 herbicide programs with and without halauxifen-methyl. Hairy buttercup and cutleaf evening primrose control was greatest when a synthetic auxin was combined with an acetolactate synthase-inhibiting herbicide. Treatments including halauxifen-methyl resulted in the greatest control of hairy buttercup, whereas a synthetic auxin herbicide plus chlorsulfuron and metsulfuron resulted in the greatest control of cutleaf evening primrose. Halauxifen-methyl is an effective addition for control of winter annual broadleaf weeds like hairy buttercup and cutleaf evening primrose in winter wheat.