Background:
Several organizations, including the Environmental Protection Agency, the World Health Organization, and the American Academy of Pediatrics, recommend that hospital sound levels not exceed 45 decibels. Yet several studies across multiple age groups have observed higher-than-recommended levels in the intensive care setting. Elevated sound levels in hospitals have been associated with sleep disturbance, patient discomfort, delayed recovery, and delirium.
Methods:
We measured sound levels in a pediatric cardiac intensive care unit (CICU) and collected vital signs, sedation dosing, and delirium scores. During a 5-week study period, sound levels for 68 patients in 22 private and 4 semi-private rooms were monitored.
Results:
Sound levels were consistently above stated recommendations, with an average daytime level of 50.6 decibels (maximum, 76.9 decibels) and an average nighttime level of 49.5 decibels (maximum, 69.6 decibels). An increase in average and maximum sound levels increased the probability of sedation administration in the following hour (p < 0.001 and p < 0.01, respectively) and was predictive of an increase in heart rate and blood pressure (p < 0.001).
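As a hedged aside on the arithmetic behind these figures (the abstract does not state how the averages were computed), sound levels in decibels are conventionally averaged on the energy scale as an equivalent continuous level (Leq) rather than arithmetically. The Python sketch below illustrates that standard formula with hypothetical minute-level readings; it is not the study's analysis code.

```python
import math

def leq(decibel_samples):
    """Leq = 10*log10(mean(10^(L/10))) over the sampled levels."""
    mean_energy = sum(10 ** (L / 10) for L in decibel_samples) / len(decibel_samples)
    return 10 * math.log10(mean_energy)

# Hypothetical readings around the reported daytime average, with one loud event
print(round(leq([48, 50, 52, 76.9, 49]), 1))  # ~69.9: the single loud event dominates
```

The example shows why brief loud events can pull an energy-averaged level well above typical background readings.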
Conclusion:
Sound levels in the CICU were consistently higher than recommended. An increase in heart rate, blood pressure and sedation utilization may suggest a stress response to persistent and sudden loud sounds. Given known negative impacts of excessive noise on stress, sleep, and brain development, as well as the similar adverse effects from the related use of sedative medications, reducing excessive and sudden noise may provide an opportunity to improve short- and long-term hemodynamic and neurodevelopmental outcomes in the pediatric cardiac intensive care unit.
Changing practice patterns caused by the pandemic have created an urgent need for guidance on prescribing stimulants via telepsychiatry for attention-deficit hyperactivity disorder (ADHD). A notable spike in stimulant prescribing accompanied the suspension of the Ryan Haight Act, which allowed stimulants to be prescribed without a face-to-face meeting. Competing forces both for and against prescribing ADHD stimulants by telepsychiatry have emerged, requiring guidelines to balance these factors. On the one hand, factors weighing in favor of increasing the availability of treatment for ADHD via telepsychiatry include enhanced access to care, reduction in the large number of untreated cases, and prevention of the known adverse outcomes of untreated ADHD. On the other hand, factors in favor of limiting telepsychiatry for ADHD include mitigating the possibility of exploiting telepsychiatry for profit or for the misuse, abuse, and diversion of stimulants. This Expert Consensus Group has developed numerous specific guidelines and advocates for continued flexibility in allowing telepsychiatry evaluation and treatment without an in-person evaluation. These guidelines also recognize the need for greater scrutiny of certain subpopulations, such as young adults without a prior diagnosis or treatment of ADHD who request immediate-release stimulants, whose requests should raise suspicion of possible medication diversion, misuse, or abuse. In such cases, nonstimulants, controlled-release stimulants, or psychosocial interventions should be prioritized. We encourage the use of outside informants to corroborate the history, the use of rating scales, and access to a hybrid model of both in-person and remote treatment.
Objective:
To measure the impact of an automated hand hygiene monitoring system (AHHMS) and an intervention program of complementary strategies on hand hygiene (HH) performance in both acute-care and long-term care (LTC) units.
Setting:
Single Veterans Affairs Medical Center (VAMC) with 2 acute-care units and 6 LTC units.
Methods:
An AHHMS that provides group HH performance rates was implemented on 8 units at a VAMC from March 2021 through April 2022. After a 4-week baseline period and a 2.5-week washout period, the 52-week intervention period included multiple evidence-based components designed to improve HH compliance. Unit HH performance rates were expressed as the number of dispenses (events) divided by the number of patient room entries and exits (opportunities) × 100. Statistical analysis was performed with a Poisson generalized additive mixed model.
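As a minimal illustration of the rate definition above (not the study's actual analysis code), the following Python sketch computes a unit's HH performance rate from hypothetical dispense and opportunity counts:

```python
def hh_performance_rate(dispenses: int, opportunities: int) -> float:
    """HH rate = dispenses (events) / room entries and exits (opportunities) x 100."""
    if opportunities == 0:
        raise ValueError("no opportunities recorded")
    return dispenses / opportunities * 100

# Hypothetical baseline month: 744 dispenses over 4,000 room entries/exits
print(round(hh_performance_rate(744, 4000), 1))  # -> 18.6, matching the baseline median
```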
Results:
During the 4-week baseline period, the median HH performance rate was 18.6 (95% CI, 16.5–21.0) for all 8 units. During the intervention period, the median HH rate increased to 21.6 (95% CI, 19.1–24.4; P < .0001), and during the last 4 weeks of the intervention period (exactly 1 year after baseline), the 8 units exhibited a median HH rate of 25.1 (95% CI, 22.2–28.4; P < .0001). The median HH rate increased from 17.5 to 20.0 (P < .0001) in LTC units and from 22.9 to 27.2 (P < .0001) in acute-care units.
Conclusions:
The intervention was associated with increased HH performance rates on all units. Performance in acute-care units was consistently higher than in LTC units, which have more visitors and more mobile veterans.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints that weeds and invasive plants impose on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
OBJECTIVES/GOALS: Supported by the State of Alabama, the Alabama Genomic Health Initiative (AGHI) aims to prevent and treat common conditions with a genetic basis. This joint UAB Medicine-HudsonAlpha Institute for Biotechnology effort provides genomic testing, interpretation, and counseling free of charge to residents of each of Alabama’s 67 counties.
METHODS/STUDY POPULATION: Launched in 2017 as a state-wide population cohort, AGHI (1.0) enrolled 6,331 Alabamians and returned individual risk of disease(s) related to the ACMG SF v2.0 medically actionable genes. In 2021, the cohort was expanded to include a primary care cohort. AGHI (2.0) has enrolled 750 primary care patients, returning individual risk of disease(s) related to the ACMG SF v3.1 gene list and pre-emptive pharmacogenetics (PGx) to guide medication therapy. Genotyping is done on the Illumina Global Diversity Array, with Sanger sequencing to confirm likely pathogenic/pathogenic variants in medically actionable genes and Taqman assays to confirm CYP2D6 copy number variants, resulting in a CLIA-grade report. Disease risk results are returned by genetic counselors, and PGx results are returned by pharmacists.
RESULTS/ANTICIPATED RESULTS: We have engaged a statewide community (>7,000 participants), returning 94 disease risk genetic reports and 500 PGx reports. Disease risk reports include increased predisposition to cancers (n=38), cardiac diseases (n=33), metabolic diseases (n=12), and other conditions (n=11). All participants (100%) harbor an actionable PGx variant, 70% are on a medication with PGx guidance, and 48% harbor PGx variants and are taking medications affected by them. For 10% of participants, pharmacists sent an active alert to the provider to consider or recommend an alternative medication. The most commonly impacted medications included antidepressants, NSAIDs, proton-pump inhibitors, and tramadol. To enable EMR integration of genomic information, we have developed an automated transfer of reports into the EMR, with genetics reports and PGx reports viewable in Cerner.
DISCUSSION/SIGNIFICANCE: We share our experience with pre-emptive implementation of genetic risk and pharmacogenetic actionability at the population and clinic level. Both patients and providers are actively engaged, providing feedback to refine the return of results. Real-time alerts with guidance at the time of prescription are needed to ensure future actionability and value.
The Dextroamphetamine Transdermal System (d-ATS) was developed as an alternative to oral amphetamine (AMP) formulations for attention-deficit/hyperactivity disorder (ADHD). In a pivotal study, d-ATS met primary and secondary efficacy endpoints for ADHD in children and adolescents. Study subjects wore d-ATS for 9 hours, and an improvement in Swanson, Kotkin, Agler, M-Flynn, and Pelham scale (SKAMP) total score was observed from 2 through 12 hours after application. Patients with ADHD may need varying durations of treatment for symptoms from day to day. This analysis describes the exposure-response (E-R) relationship for d-ATS and explores possible outcomes for wear times ≤9 hours under varying assumptions.
Methods
A population pharmacokinetic (PK) model was developed to describe AMP disposition following d-ATS administration. This model was used to construct a population pharmacokinetic/pharmacodynamic (PK/PD) model from SKAMP total score data from two pediatric clinical studies to characterize onset and duration of effect after d-ATS administration. The integrated PK/PD model was used to describe the d-ATS E-R relationship and to simulate the potential onset and duration of effect of d-ATS for various removal times (<9 hours), using SKAMP scores as the efficacy measure. Subject-level AMP PK and SKAMP profiles were simulated for d-ATS removal at 4–9 hours post-application under different assumptions for AMP absorption after early patch removal. Modifications were made to the original population PK model to simulate patch removal.
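To make the modeling approach concrete, here is a deliberately simplified, hypothetical sketch of the idea: a one-compartment PK model with zero-order transdermal input that stops at patch removal, linked to SKAMP score through an inhibitory Emax function. All parameter values are invented for illustration and are not the study's published estimates.

```python
import numpy as np

def simulate_skamp(removal_h, rate_in=1.0, ke=0.2, vol=50.0,
                   e0=40.0, emax=25.0, ec50=0.01, dt=0.05, t_end=24.0):
    """Simulate drug amount, concentration, and SKAMP score over time (hypothetical)."""
    times = np.arange(0.0, t_end + dt, dt)
    amount, conc, skamp = 0.0, [], []
    for t in times:
        input_rate = rate_in if t < removal_h else 0.0  # transdermal input stops at removal
        amount += (input_rate - ke * amount) * dt       # forward Euler step
        c = amount / vol
        conc.append(c)
        skamp.append(e0 - emax * c / (ec50 + c))        # inhibitory Emax link to SKAMP
    return times, np.array(conc), np.array(skamp)

# Compare a 4-hour and a 9-hour wear time at the 12-hour mark
for removal in (4.0, 9.0):
    times, _, skamp = simulate_skamp(removal)
    print(f"removal at {removal} h -> SKAMP {skamp[int(12 / 0.05)]:.1f} at 12 h")
```

With these toy parameters, earlier removal yields a faster return toward the baseline score, which is the qualitative behavior the simulations below describe.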
Results
Data from 81 children and 41 adolescents, 6–17 years old, were included. The model provided a reasonable description of SKAMP total score over time, showing an initial decline ~2 hours after patch application. In approximately 50% of children and adolescents, the maximum decline in SKAMP scores was observed within the first 4 hours after patch application. Earlier simulated d-ATS removal times were associated with reduced systemic exposure and an earlier return to near-baseline scores across the range of assumptions tested.
Across the different absorption assumptions, the simulated profiles changed only modestly. For example, under moderate/conservative assumptions following a 9-hour wear time, SKAMP scores returned to within 90% of the baseline value in ~49% of subjects by 12 hours and ~80% of subjects by 16 hours. Following a 4-hour wear time, the percentages were ~74% by 12 hours and ~95% by 16 hours.
Conclusions
Simulation results suggest that the duration of d-ATS efficacy may be related to wear time, which can be adjusted according to treatment needs, consistent with published observations for another transdermal stimulant. The d-ATS patch provides the ability to control medication exposure by shortening wear time, allowing treatment duration to be individualized and optimized in ADHD patients who have varying schedules and needs.
When the former Thai general, coup-maker and junta leader Prayuth Chan-ocha swore an oath of loyalty to Thailand’s monarch following the 26 March 2019 election, he and his cabinet omitted the pledge to ‘uphold and observe the Constitution of the Kingdom of Thailand in every respect’ (‘Ombudsman refers oath blunder to Constitutional Court’, Bangkok Post, 27 August 2019). To date, the omission has not been rectified, and the Constitutional Court has refused to accept a case on its unconstitutionality. If confirmation were needed after 18 coups producing 19 constitutions (Harding and Leyland, 2011, p xxx), the absent oath eloquently conveyed the uncertain place of constitutional law in the governance of Thailand, particularly in the eyes of its two most important institutions, the military and the monarchy.
Western civil–military relations, and especially the principle of civilian control of the military, reside in a framework in which the rule of law is fundamental. It is law that defines the relations between civilian and military leaders. So what kind of rule of law applies in Thailand? While Thailand formally adopted a Western criminal code in 1908 and established a Constitutional Court in 1997, Thailand’s practice of law remains far from that of Western states (Wise, 2019). This applies at the top echelons of the legal system, with judges demonstrably unable to truly accept the constitution as the highest law. It also applies in civil matters, with northern Thais preferring Buddhist ethics and karmic justice to courts and litigation for dealing with the wrongdoings encountered in their daily lives. This is not to say that Thai rule of law is unchanging. The reforms of 1997, including the establishment of independent organizations such as the anti-corruption and electoral commissions, and indeed the Constitutional Court, have brought the law into new spheres and established new norms.
The relative weakness of rule of law and especially constitutionalism, however, reflects the enduring coexistence of modern and traditional notions of legitimacy in governance. Some observe that Thailand’s history, as a polity never formally colonized, has provided greater scope for continuity with its pre-colonial past (Wise, 2019, p xvi).
In this paper, we answer the multiple calls for a systematic analysis of paradigms and subdisciplines in political science—the search for coherence within a fragmented field. We collected a large dataset of over seven hundred thousand political science publications indexed in Web of Science since 1946. We found at least two waves of political science development, from behaviorism to new institutionalism. Political science appeared to be more fragmented than the literature suggests—instead of ten subdisciplines, we found 66 islands. Despite this fragmentation, however, there is also a tendency toward integration in contemporary political science, revealed by the coexistence of several paradigms and by the coherent, interconnected topics of the “canon of political science” evident in the core-periphery structure of topic networks. This was the first large-scale investigation of the entire political science field, made possible by newly developed methods of bibliometric network analysis: temporal bibliometric analysis and the islands method of clustering. The methodological contribution of this work to network science is the evaluation of the islands method of network clustering against hierarchical cluster analysis, assessing its ability to remove misleading information and thereby produce a more meaningful clustering of large weighted networks.
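As a rough illustration of the islands idea (a simplified threshold sweep over edge weights, not the authors' implementation or the exact Batagelj–Zaveršnik algorithm), the following Python sketch extracts cohesive subnetworks whose internal ties are stronger than the ties linking them to the rest of a toy weighted network:

```python
import networkx as nx

def simple_line_islands(G, min_size=2, max_size=50):
    """Sweep weight thresholds from strongest down; collect size-bounded components."""
    weights = sorted({d["weight"] for _, _, d in G.edges(data=True)}, reverse=True)
    found, seen = [], set()
    for w in weights:
        # keep only edges at least as strong as the current threshold
        H = G.edge_subgraph((u, v) for u, v, d in G.edges(data=True) if d["weight"] >= w)
        for comp in nx.connected_components(H):
            if min_size <= len(comp) <= max_size and frozenset(comp) not in seen:
                seen.add(frozenset(comp))
                found.append(set(comp))
    return found

# Toy co-citation network: two dense "islands" joined by a single weak bridge
G = nx.Graph()
G.add_weighted_edges_from([("a", "b", 5), ("b", "c", 5), ("a", "c", 4),
                           ("x", "y", 6), ("y", "z", 6), ("c", "x", 1)])
print(simple_line_islands(G, max_size=4))  # recovers {x, y, z} and {a, b, c}
```

The size bound plays the same role as in the islands method proper: it discards the trivial "island" consisting of the whole connected network once the weak bridge is admitted.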
In this paper, we examine the contribution of the journal Network Science to the network science discipline. We do so from two perspectives. First, expanding the existing taxonomy of article contributions, we examine trends in theory testing, theory building, and new method development within the journal’s articles. We find that the journal demands a high level of theoretical contribution and methodological rigor, and that high levels of theoretical and methodological contribution are significant predictors of article citation rates. Second, we look at the composition of the studies in Network Science and determine that the journal has already established a solid “hard core” for the new discipline.
This paper introduces historical aspects of the concepts of correspondence and coherence, with emphasis on the nineteenth century, when key aspects of modern science were emerging. It is not intended to be a definitive history of the concepts of correspondence and coherence as they have been used across the centuries in the field of inquiry that we now call science. Rather, it is a brief history that highlights the apparent origins of the concepts and discusses how they contributed to two important science-related controversies. The first relates to aspects of evolution, in which correspondence and coherence, as competing theories of truth, played a central role; the controversy about evolution continues into the beginning of the twenty-first century in forms recognizably similar to those of the middle of the nineteenth century. The second controversy relates to the etiology of blood-borne infections (sepsis) during childbirth (childbed fever). In addition to correspondence and coherence, the authors introduce other theories of truth and discuss an evolutionarily cogent theory of truth, the pragmatic theory of truth.
Archival charcoal tree-ring segments from the Mississippian center of Kincaid Mounds provide chronometric information for the history of this important site. However, charcoal recovered from Kincaid was originally treated with a paraffin consolidant, a once-common practice in American archaeology. This paper presents data on the efficacy of a solvent pretreatment protocol and new wiggle-matched 14C dates from the largest mound (Mound 10) at Kincaid. FTIR and 14C analyses of known-age charcoal intentionally contaminated with paraffin, as well as of archaeological material, show that a chloroform pretreatment is effective at removing paraffin contamination. Wiggle-matched cutting dates from the final construction episodes on Mound 10 indicate that the mound was used in the late 1300s, with the construction of a unique structure on the apex occurring around 1390. This study demonstrates the potential for museum collections of archaeological charcoal to contribute high-resolution chronological information despite past conservation practices that complicate 14C dating.
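As an illustration of the wiggle-matching logic (a hypothetical sketch, not the calibration software used in the study), the following Python code slides a sequence of 14C measurements with known tree-ring offsets along a stand-in calibration curve and picks the felling year that minimizes chi-square. The curve and all measurements below are made up; real work would use IntCal.

```python
import numpy as np

def wiggle_match(offsets, c14_ages, c14_errs, cal_years, cal_mu, cal_sig):
    """Return the candidate felling year minimizing chi-square against the curve."""
    best_year, best_chi2 = None, np.inf
    for t_fell in cal_years:  # candidate outermost-ring (felling) year
        ring_years = t_fell - np.asarray(offsets)
        mus = np.interp(ring_years, cal_years, cal_mu)
        sigs2 = np.asarray(c14_errs) ** 2 + np.interp(ring_years, cal_years, cal_sig) ** 2
        chi2 = np.sum((np.asarray(c14_ages) - mus) ** 2 / sigs2)
        if chi2 < best_chi2:
            best_year, best_chi2 = t_fell, chi2
    return best_year, best_chi2

# Hypothetical example: segments centered 0, 20, and 40 rings before the
# outermost ring, with invented 14C ages (BP) and a toy linear "curve".
years = np.arange(1300, 1451)            # calendar years AD
mu = 600.0 - 0.5 * (years - 1300)        # toy calibration curve (14C years BP)
sig = np.full_like(mu, 8.0)
print(wiggle_match([0, 20, 40], [555, 565, 575], [15, 15, 15], years, mu, sig))
```

Because the ring offsets are fixed by the wood's anatomy, the whole sequence must fit the curve at once, which is what yields cutting dates far more precise than any single 14C determination.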