The consent process for research studies can be burdensome for potential participants due to complex information and lengthy consent forms. This pragmatic study aimed to improve the consent experience and evaluate its impact on participant decision making, study knowledge, and satisfaction with the In Our DNA SC program, a population-based genomic screening initiative. We compared two consent procedures: standard consent (SC), involving a PDF document, and enhanced consent (EC), incorporating a pictograph and true-or-false questions. Decision-making control, study knowledge, satisfaction, and time to consent were assessed. We analyzed data for 109 individuals who completed the SC and 96 who completed the EC. Results indicated strong decision-making control and high levels of knowledge and satisfaction in both groups. No significant differences were found between the two groups, although the EC took longer for participants to complete. Future modifications include incorporating video modules and launching a Spanish version of the consent experience. Overall, this study contributes to the growing literature on consent improvements and highlights the need to assess salient components and explore participant preferences for receiving consent information.
In response to the COVID-19 pandemic, we implemented a plasma coordination center within two months to support transfusions for two outpatient randomized controlled trials. The center's design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
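The platform described here couples two functions: randomizing participant assignments and tracking a shared, per-site inventory of blood-group-compatible plasma. The sketch below illustrates that logic in Python; the class and function names, the 1:1 allocation ratio, and the specific workflow are our assumptions for illustration, not the trial platform's actual implementation. The compatibility table follows standard plasma compatibility, in which AB is the universal plasma donor.

```python
import random
from collections import defaultdict

# Plasma (not red cell) compatibility: recipient -> acceptable donor groups.
COMPATIBLE_DONOR_GROUPS = {
    "O": ["O", "A", "B", "AB"],
    "A": ["A", "AB"],
    "B": ["B", "AB"],
    "AB": ["AB"],
}

class PlasmaInventory:
    """Hypothetical per-site ledger of available investigational units."""
    def __init__(self):
        # units[site][(arm, donor_group)] -> count of available units
        self.units = defaultdict(lambda: defaultdict(int))

    def receive(self, site, arm, donor_group, n):
        self.units[site][(arm, donor_group)] += n

    def reserve(self, site, arm, recipient_group):
        """Reserve one compatible unit at a site; return its donor group or None."""
        for donor_group in COMPATIBLE_DONOR_GROUPS[recipient_group]:
            if self.units[site][(arm, donor_group)] > 0:
                self.units[site][(arm, donor_group)] -= 1
                return donor_group
        return None  # in the real workflow, this would trigger an overnight shipment

def randomize(participant_id, inventory, site, recipient_group):
    arm = random.choice(["control", "convalescent"])  # assumed 1:1 allocation
    unit = inventory.reserve(site, arm, recipient_group)
    return {"participant": participant_id, "arm": arm, "unit_group": unit}

# Example use with invented identifiers:
inv = PlasmaInventory()
inv.receive("site_01", "convalescent", "AB", 5)
inv.receive("site_01", "control", "A", 5)
print(randomize("P-0001", inv, "site_01", recipient_group="A"))
```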
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipment of blood group-compatible plasma for transfusion into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain constraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
Jellyfishes have ecological and societal value, but our understanding of the taxonomic identity of many jellyfish species remains limited. Here, an approach integrating morphological and molecular (16S ribosomal RNA and cytochrome oxidase I) data enables taxonomic assessment of the blubber jellyfish found in the Philippines. In this study, we aimed to resolve doubt about the taxonomy of Acromitoides purpurus, a valid binomen at the time of our research. Our morphological findings confirm that this jellyfish belongs to the genus Catostylus and is distinct from known species of the genus inhabiting the Western Pacific, such as Catostylus ouwensi, Catostylus townsendi, and Catostylus mosaicus. Detailed morphological and molecular comparisons of the type specimens from the Philippines with the other Catostylus species revive the binomen Catostylus purpurus and invalidate A. purpurus. Genetic analysis also distinguishes this Philippine jellyfish from C. townsendi and C. mosaicus. Through this study, we arranged several Catostylidae taxa into species inquirendae (Catostylus tripterus, Catostylus turgescens, and Acromitoides stiphropterus) and one genus inquirendum (Acromitoides) and provided an identification key for species of Catostylus. This comprehensive study confirms the blubber jellyfish as C. purpurus, enriching our understanding of jellyfish biodiversity. The integration of morphological and genetic analyses proves vital in resolving taxonomic ambiguities within the family Catostylidae and in the accurate identification of scyphozoan jellyfishes.
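For readers unfamiliar with the molecular side of such integrative taxonomy, the minimal Python sketch below shows the kind of pairwise-distance comparison commonly applied to aligned 16S or COI barcode sequences. The sequence fragments are invented placeholders, not the study's data, and real analyses would use full-length alignments and model-based distances.

```python
# Uncorrected p-distance: the simplest measure separating barcode sequences.
def p_distance(seq1, seq2):
    """Proportion of differing sites between two aligned sequences."""
    assert len(seq1) == len(seq2), "sequences must be aligned to equal length"
    pairs = [(a, b) for a, b in zip(seq1, seq2) if a != "-" and b != "-"]
    diffs = sum(1 for a, b in pairs if a != b)
    return diffs / len(pairs)

aligned = {  # placeholder fragments, equal length by construction
    "C_purpurus":  "ATGGCACTATTCTACAA",
    "C_townsendi": "ATGGCTCTTTTCTACAA",
    "C_mosaicus":  "ATGACACTATTTTACGA",
}
for a in aligned:
    for b in aligned:
        if a < b:
            print(a, b, round(p_distance(aligned[a], aligned[b]), 3))
```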
As governments increasingly adopt algorithms and artificial intelligence (AAI), we still know comparatively little about citizens’ support for algorithmic government. In this paper, we analyze how many and what kinds of reasons for government use of AAI citizens support. We use a sample of 17,000 respondents from 16 OECD countries and find that opinions on algorithmic government are divided. A narrow majority of people (55.6%) support a majority of reasons for using algorithmic government, and this is relatively consistent across countries. Results from multilevel models suggest that most of the cross-country variation is explained by individual-level characteristics, including age, education, gender, and income. Older and more educated respondents are more accepting of algorithmic government, while female and low-income respondents are less supportive. Finally, we classify the reasons for using algorithmic government into two types, “fairness” and “efficiency,” and find that support for them varies based on individuals’ political attitudes.
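The multilevel set-up described, individuals nested in countries, can be sketched with a random-intercept model as below. The column names and data file are assumptions for illustration; statsmodels' MixedLM fits the linear mixed model, and the country-level variance component indicates how much cross-country variation remains after individual predictors.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file: one row per respondent, with a count of supported reasons.
df = pd.read_csv("oecd_survey.csv")

model = smf.mixedlm(
    "support_count ~ age + education + gender + income",  # assumed column names
    data=df,
    groups=df["country"],  # random intercept for each of the 16 OECD countries
)
result = model.fit()
print(result.summary())
# A small country-level variance relative to residual variance corresponds to
# the abstract's finding: most variation is explained at the individual level.
```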
The Uniform Information Density (UID) hypothesis proposes that speakers communicate by transmitting information at a close-to-constant rate. When speakers choose between two syntactic variants, the hypothesis claims they prefer the variant that distributes information most evenly, avoiding peaks and troughs in the signal. If speakers prefer transmitting information uniformly, then comprehenders should also prefer a uniform signal, experiencing difficulty whenever confronted with informational peaks. However, the literature investigating this hypothesis has focused mostly on production, with only a few studies considering comprehension. In this study, we investigate comprehension in two eye-tracking experiments. Participants read sentences of two different lengths, reflecting different degrees of density, containing either a dense structure (a nominal compound, NC) or a structure that spreads the information over more words (a noun followed by a prepositional phrase, PP). Favoring the UID hypothesis, participants gazed longer at text segments following the critical structure when it was an NC than when it was a PP. They also regressed more in sentences containing longer structures. However, the pattern of results was not as clear as expected, potentially reflecting participants’ experience with the denser structure or task differences between production and comprehension. These aspects should be taken into account in future research investigating the UID hypothesis in comprehension.
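Information density in UID work is typically operationalised as per-word surprisal, the negative log probability of each word given its context, with uniformity measured as how little surprisal varies across the sentence. The sketch below illustrates the contrast the study manipulates; the probabilities are invented for illustration, not taken from the experiments.

```python
import math

def surprisal(prob):
    """Information content of a word, in bits."""
    return -math.log2(prob)

def uid_variance(word_probs):
    """Variance of per-word surprisal; lower = more uniform signal."""
    s = [surprisal(p) for p in word_probs]
    mean = sum(s) / len(s)
    return sum((x - mean) ** 2 for x in s) / len(s)

# A dense structure (NC) packs low-probability, high-surprisal words together...
nc_probs = [0.2, 0.01, 0.008, 0.3]
# ...while a PP paraphrase spreads the content over more, likelier words.
pp_probs = [0.2, 0.05, 0.3, 0.06, 0.09, 0.3]

print("NC surprisal variance:", round(uid_variance(nc_probs), 2))
print("PP surprisal variance:", round(uid_variance(pp_probs), 2))
# With these illustrative values, the NC shows the higher variance,
# i.e., the less uniform (peakier) signal.
```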
Evidence from previous research suggests that frame-of-reference (FOR) training is effective at improving assessor ratings in many organizational settings. Yet no research has presented a thorough examination of systematic sources of variance (assessor-related effects, evaluation settings, and measurement design features) that might influence training effectiveness. Using a factorial ANOVA and variance components analyses on a database of four studies of frame-of-reference assessor training, we found that (a) training is most effective at identifying low levels of performance and (b) the setting of the training makes little difference with respect to training effectiveness. We also show evidence of the importance of rater training as a key determinant of the quality of performance ratings in general. Implications for FOR training theory and practice are discussed.
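The two analyses named here can be sketched as follows, on hypothetical data: a factorial ANOVA over design factors, and a variance-components view via a random-intercept model per assessor. The column names and data file are assumptions, not the studies' actual variables.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical pooled database of ratings from the four training studies.
df = pd.read_csv("for_training_ratings.csv")

# Factorial ANOVA: does rating accuracy depend on the ratee's performance
# level, the training setting, and their interaction?
ols_fit = smf.ols("rating_accuracy ~ C(performance_level) * C(setting)", data=df).fit()
print(sm.stats.anova_lm(ols_fit, typ=2))

# Variance components: how much rating variance is attributable to assessors?
mixed_fit = smf.mixedlm("rating_accuracy ~ 1", data=df, groups=df["assessor_id"]).fit()
print(mixed_fit.summary())
```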
Infrastructure in several economies in the Global South has rapidly undergone financialization, aided and abetted by governments opening up their infrastructure assets to global institutional investors in search of stable, predictable revenue streams. This account of financialization could be the end of the story were it not for the fact that Christophers (2015) and others have shown that institutional investors are not simply in the game of ‘finding’ value or ‘harvesting’ it from obliging states; rather, they actively construct it. What often catches the eye, however, are the more overt forms of financial engineering (Ashton et al., 2012), whereas what tends to go unnoticed are the ways in which infrastructure assets are routinely ‘worked’ to generate value over time. Here, we draw attention to a slower-paced financialization of infrastructure assets where, following Chiapello (2015, 2020), investors are engaged in a continual process of evaluation and revaluation of their assets to add value over and above prevailing benchmarks. Taking the example of Canada's Ontario Teachers’ Pension Plan (OTPP) and its extensive investments in Chilean water infrastructure, this article considers how a global investment fund draws on financial practices developed in the advanced economies to add value to long-term infrastructure assets in the Global South. Such practices, we argue, enact a routine form of financial subordination which does not match the familiar image of wholly subservient and dominated dependent economies. Rather, the power asymmetries involved equate less to a zero-sum game and more to a game in which the benefits are unequally shared between asset managers in the Global North and states in the Global South, with the latter effectively cooperating in their own submission in ways that are not always acknowledged as such.
Data from a national survey of 348 U.S. sports field managers were used to examine the effects of participation in Cooperative Extension events on the adoption of turfgrass weed management practices. Of the respondents, 94% had attended at least one event in the previous 3 yr. Of this 94%, 97% reported adopting at least one practice as a result of knowledge gained at an Extension turfgrass event. Half of the respondents had adopted four or more practices; a third adopted five or more practices. Nonchemical, cultural practices were the most widely adopted (65% of respondents). Multiple regression analysis was used to examine factors explaining practice adoption and Extension event attendance. Compared to attending one event, attending three events increased total adoption by an average of one practice. Attending four or more events increased total adoption by two practices. Attending four or more events (compared to one event) increased the odds of adopting six individual practices by 3- to 6-fold, depending on the practice. This suggests that practice adoption could be enhanced by encouraging repeat attendance among past Extension event attendees. Manager experience was a statistically significant predictor of the number of Extension events attended but a poor direct predictor of practice adoption. Experience does not appear to increase adoption directly, but indirectly, via its impact on Extension event attendance. In addition to questions about weed management generally, the survey asked questions specifically about annual bluegrass management. Respondents were asked to rank seven sources of information for their helpfulness in managing annual bluegrass. There was no single dominant information source, but Extension was ranked as the most helpful more often than any other source (by 22% of respondents) and was ranked among the top three by 53%, closely behind field representative/local distributor sources at 54%.
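The two effects reported, an increase in the total count of practices adopted and multiplied odds of adopting individual practices, map naturally onto a count model and a logistic model. The sketch below shows that pairing with invented column names; it is an illustration of the analysis style, not the survey's actual code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sports_field_survey.csv")  # hypothetical survey extract

# Count model: total practices adopted as a function of events attended
# (treated as categorical with levels such as 1, 2, 3, and 4+).
count_fit = smf.poisson("n_practices ~ C(events_attended)", data=df).fit()
print(count_fit.summary())

# Logistic model for one individual practice; exponentiated coefficients are
# odds ratios, the scale on which the abstract reports 3- to 6-fold increases.
logit_fit = smf.logit("adopted_cultural ~ C(events_attended)", data=df).fit()
print(np.exp(logit_fit.params).round(2))
```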
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Objective
To compare supraglottoplasty with non-surgical treatment in children with laryngomalacia and mild, moderate or severe obstructive sleep apnoea.
Methods
Patients were classified based on their obstructive apnoea hypopnoea index on initial polysomnogram, which was compared to their post-treatment polysomnogram.
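The abstract does not state the severity cut-offs used, so the sketch below uses the commonly cited paediatric thresholds for the obstructive apnoea hypopnoea index (OAHI, events per hour) purely for illustration; the study's actual classification may differ.

```python
def classify_osa(oahi):
    """Assumed paediatric OAHI severity bands (events per hour)."""
    if oahi < 1:
        return "no OSA"
    elif oahi < 5:
        return "mild"
    elif oahi < 10:
        return "moderate"
    else:
        return "severe"

def oahi_change(initial, post_treatment):
    """Treatment response between the two polysomnograms; negative = improvement."""
    return post_treatment - initial

# Example: a child with an initial OAHI of 14 (severe) falling to 3 (mild).
print(classify_osa(14), "->", classify_osa(3), "; change =", oahi_change(14, 3))
```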
Results
Eighteen patients underwent supraglottoplasty, and 12 patients had non-surgical treatment. The average obstructive apnoea hypopnoea index fell by 12.68 events per hour (p = 0.0039) in the supraglottoplasty group and by 3.3 events per hour (p = 0.3) in the non-surgical treatment group. The difference in the change in obstructive apnoea hypopnoea index between the surgical and non-surgical groups did not reach statistical significance (p = 0.09).
Conclusion
Patients with laryngomalacia and obstructive sleep apnoea had a statistically significant improvement in obstructive apnoea hypopnoea index after supraglottoplasty irrespective of obstructive sleep apnoea severity, whereas patients who received non-surgical treatment had more variable and unpredictable results. Direct comparison of the change between the two groups did not find supraglottoplasty to be superior to non-surgical treatment. Larger prospective studies are recommended.
The Food and Drug Administration (FDA) reviews the safety, efficacy, and quality of medical devices through its regulatory process. The FDA Safety and Innovation Act (FDASIA) of 2012 aimed to accelerate the regulatory process for medical devices.
Objectives:
The purpose of our study was to (1) quantify characteristics of pivotal clinical trials (PCTs) supporting the premarket approval of endovascular medical devices and (2) analyze trends over the last two decades in light of the FDASIA.
Methods:
We surveyed the study designs of endovascular devices with PCTs from the US FDA pre-market approval medical devices database. The effect of FDASIA on key design parameters (e.g., randomization, masking, and number of enrolled patients) was estimated using an interrupted time series analysis (segmented regression).
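An interrupted time series with segmented regression models the outcome as a pre-intervention trend plus a level change and a slope change at the intervention. The sketch below shows this structure for a yearly design parameter around FDASIA (2012); the variable names and data file are assumptions for illustration.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical: one row per year, 2000-2018, with aggregated PCT characteristics.
df = pd.read_csv("pct_design_by_year.csv")
df["time"] = df["year"] - df["year"].min()        # years since start of series
df["post"] = (df["year"] >= 2012).astype(int)     # FDASIA enacted in 2012
df["time_after"] = df["time"] * df["post"]        # post-FDASIA slope change

# e.g., outcome = share of PCTs that were double-blinded in a given year
fit = smf.ols("double_blind_share ~ time + post + time_after", data=df).fit()
print(fit.summary())
# 'post' estimates the immediate level change at FDASIA; 'time_after'
# estimates the change in trend thereafter.
```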
Results:
We identified 117 devices approved between 2000 and 2018. FDASIA was associated with a decrease in double blinding (p < 0.0001) and a decrease in the use of historical comparators (p < 0.0001).
Discussion:
Our results reveal an overall trend of decreasing regulatory requirements for clinical trial characteristics, but a compensatory increase in the rate of post-approval studies across device classes. Furthermore, there was an emphasis on proving equivalence or non-inferiority rather than on greater use of active comparators in clinical trials. Medical device stakeholders, notably clinicians, must be aware of the shifting regulatory landscape in order to play an active role in promoting patient safety.
Excavations at Dunmore Road, Abingdon (formerly Berks.) uncovered activity dating from the Neolithic to the early Roman period. Following some ephemeral traces of Neolithic and Bronze Age activity, the earliest clear evidence of settlement was represented in the early Iron Age by a series of post-built and ditched roundhouses, numerous pits, and four- and six-post structures. Middle Iron Age activity was represented primarily by a series of enclosures accompanied by an inhumation burial and several pits. One of the enclosures was recut in the late Iron Age and a larger adjoining enclosure was established during this time. The larger enclosure was recut three times in the early Roman period, showing continuity in local activity, which also saw the construction of a probable masonry building. A previously unknown Roman road, flanked by ditches c.20–28 m apart with layers of metalling in between, was found extending across the site. Projection of the road alignment southwards connects it to the late Iron Age oppidum and Roman nucleated settlement at Abingdon. No road has previously been found that links Abingdon to the main Roman road network. Activity ceased in the early second century AD, around the time of settlement and landscape reorganisation observed more widely in the Abingdon area. The road does not appear to have been refurbished thereafter, and the extent to which it continued in use through the later Roman period is unknown. Medieval furrows crossed the site on the same alignment as the Iron Age and early Roman enclosures and perpendicular to the Roman road. However, the furrows may have been aligned upon Wootton Road to the west rather than indicating any influence from the late prehistoric or Roman remains.
Oxford Archaeology (OA) undertook an archaeological excavation in advance of residential development to the north of Dunmore Road in Abingdon in 2018. The excavation area was centred at SU 49170 98768 and covered c.2.48 ha (Fig. 1). It lies within the south-western part of the Dunmore Road development site, which extended across c.9.5 ha. The River Stert defines the north-eastern side of the development site and joins the River Thames c.1.95 km south of the site. The site is located at 64 m above OD and has a slight slope from north to south.
At the broadest systems level, there are several possible national healthcare systems. Hypothetically, there might be a free-market approach to healthcare, in which there would be little or no government regulation. No country has implemented such a system, and even if it were possible, it is not clear that burnout risk to healthcare providers would be reduced. More familiarly, the socialized medicine approach is implemented in many parts of the world. Such a system, in which the government provides healthcare, free to the patient and paid for by taxes, has many well-known pros and cons. The hybrid system, as seen in the United States, combines elements of the free-market and the socialized medicine approaches, and also has its pros and cons. There is growing interest in so-called universal healthcare, which tilts the hybrid system a bit more in the direction of socialized medicine. As with the other national system options, there is no clear-cut impact on burnout with universal healthcare. At present, no existing national healthcare system is structured to reduce burnout among healthcare providers.
The prevailing business model in which most medical practices, hospitals, and larger healthcare networks operate is a volume-based, fee-for-service model. Income to the healthcare provider is based substantially on the number of patients seen. As a consequence, there is pressure on the healthcare provider to see as many patients as possible. It is well known that such an “assembly-line, piece-rate-pay” approach is a major factor in promoting burnout. One alternative is the concierge medical practice, and in the United States this alternative is growing modestly among primary care physicians, where it is most clearly applicable. A more widely applicable alternative is quality-based compensation, though actually determining the relevant metrics of quality and administering such a system have proven problematic. As yet there is no clear-cut alternative to the fee-for-service model, but there is widespread agreement that the unintended consequences of the model are increased risks of burnout. The spike in telemedicine brought on by the pandemic shows that when changes in healthcare are seen as imperative, systems-level strategies can indeed change, and quickly.
Beyond understanding one’s own personality, possible associated risk factors for burnout, and personality-based strategies to reduce that risk, it is critically important for individuals to attend to self-care. Self-care strategies are relatively easily implemented and are largely under the control of the individual. Such self-care strategies center on rest and relaxation, healthy diet, and regular exercise. We present and elaborate on a list of “Ten Pillars” of self-care, which include health literacy; mental well-being; physical activity; healthy eating; risk-avoidance or mitigation; good hygiene; rational and responsible use of products, services, diagnostics, and medicines; social self-care; emotional self-care; and a comprehensive workplace self-care strategy. Active self-care on the part of physicians and other healthcare providers can influence similar behaviors on the part of their coworkers. We emphasize that the self-care strategies we overview are largely under the control of the individual and are relatively easily implemented. They should be a primary focus of all healthcare providers.
The data strongly support the position that social support is a powerful source of stress reduction and, thus, a valuable tool for managing burnout. Indeed, much of the stress-management literature suggests that social support is the single most powerful of the whole array of stress-reduction strategies available to and recommended for individuals. Such comfort and support may center on emotional (expressing empathy), instrumental (giving tangible assistance), informational (giving advice), or reappraisal (reframing, suggesting different ways of looking at the stressors) help. Most commonly, social support will comprise a combination of those four types. A critical factor in the effectiveness of social support is the level of trust between the receiver and giver of such support. As widely accepted as the critical role of social support is, healthcare workers generally underutilize their support systems, formal or informal, for a variety of reasons. The ongoing pandemic has made this powerful stress-reduction/burnout-reduction strategy all the more critical.
As important as individual solutions and team-based solutions are, there is a critical third level of potential solutions that can and should be implemented, namely, system-level solutions. Some of these broader solutions can be implemented fairly directly by leaders in local healthcare systems, including individual practices and hospitals. One especially important “local systems” solution is workflow analysis and workflow simplification. As valuable as improving workflow is, it is a challenge to persuade leaders to engage in such a change process, and it is a challenge to persuade those who would benefit from workflow simplification to actually change their behavior. The pandemic is providing a major “reset” opportunity, the chance to rethink how we do things in general and, more particularly, in healthcare.