Behavioral economics has grown significantly in importance and prevalence within the economics profession over the last couple of decades. Most economics departments now include researchers conducting behavioral research, and most economics journals regularly publish behavioral work.
Behavioral economics is generally defined as using evidence and constructs from neighboring social sciences, especially about limits on computation, willpower, and self-interest, to inform economic analysis (e.g., Camerer and Loewenstein, 2003). While many of these constructs come from psychology, other social sciences have much to contribute as well (see Weber and Dawes, 2005). For instance, anthropological research has provided important insights into the understanding of how social institutions and interactions shape strategic behavior (see Henrich et al., 2001).
Previous studies have shown that simply knowing one player moves first can affect behavior in games, even when the first mover's moves are known to be unobservable. This observation violates the game-theoretic principle that the timing of unobserved moves is irrelevant, but it is consistent with virtual observability, a theory of how timing can matter without the ability to observe actions. However, this previous research only shows that timing matters in games where knowledge that one player moved first can help select that player's preferred equilibrium, which presents an alternative explanation to virtual observability. We extend this work by varying the timing of unobservable moves in ultimatum bargaining games and “weak link” coordination games. In the latter, the equilibrium selection explanation predicts no change in behavior due to timing differences. We find that timing without observability affects behavior in both games, though the effects are modest.
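The “weak link” coordination game mentioned above is a minimum-effort game: each player's payoff rises with the group's lowest effort and falls with their own effort. A minimal sketch, using a common parameterization whose specific constants (a, b, c) are illustrative and not taken from the study:

```python
def weak_link_payoff(own_effort, all_efforts, a=0.2, b=0.1, c=0.6):
    """Payoff in a minimum-effort ('weak link') coordination game.

    A player's payoff increases with the group's minimum effort and
    decreases with that player's own effort. The constants a, b, c are
    illustrative, not taken from the study being summarized.
    """
    return a * min(all_efforts) - b * own_effort + c

# Every profile in which all players choose the same effort is an equilibrium:
# raising effort alone is costly, and lowering it alone drags the minimum down.
print(weak_link_payoff(3, [3, 3, 3]))  # symmetric profile
print(weak_link_payoff(4, [3, 3, 4]))  # unilateral deviation pays less
```

Because every common effort level is an equilibrium, knowing that one player "moved first" cannot select among equilibria here, which is why timing effects in this game cannot be explained by equilibrium selection.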
Duchenne muscular dystrophy is a devastating neuromuscular disorder characterized by the loss of dystrophin, inevitably leading to cardiomyopathy. Despite publications on prophylaxis and treatment with cardiac medications to mitigate cardiomyopathy progression, gaps remain in the specifics of medication initiation and optimization.
Method:
This document is an expert opinion statement, addressing a critical gap in cardiac care for Duchenne muscular dystrophy. It provides thorough recommendations for the initiation and titration of cardiac medications based on disease progression and patient response. Recommendations are derived from the expertise of the Advance Cardiac Therapies Improving Outcomes Network and are informed by established guidelines from the American Heart Association, American College of Cardiology, and Duchenne Muscular Dystrophy Care Considerations. These expert-derived recommendations aim to navigate the complexities of Duchenne muscular dystrophy-related cardiac care.
Results:
Comprehensive recommendations for initiation, titration, and optimization of critical cardiac medications are provided to address Duchenne muscular dystrophy-associated cardiomyopathy.
Discussion:
The management of Duchenne muscular dystrophy requires a multidisciplinary approach. However, the diversity of healthcare providers involved in Duchenne muscular dystrophy can result in variations in cardiac care, complicating efforts to standardize treatment and improve patient outcomes. The aim of this report is to provide a roadmap for managing Duchenne muscular dystrophy-associated cardiomyopathy by elucidating the timing and dosage nuances crucial for optimal therapeutic efficacy, ultimately improving cardiac outcomes and quality of life for individuals with Duchenne muscular dystrophy.
Conclusion:
This document seeks to establish a standardized framework for cardiac care in Duchenne muscular dystrophy, aiming to improve cardiac prognosis.
In responding to a Chemical, Biological, Radiological, and Nuclear explosive (CBRNe) disaster, clinical leaders have important decision-making responsibilities which include implementing hospital disaster protocols or incident command systems, managing staffing, and allocating resources. Despite emergency care clinical leaders’ integral role, there is minimal literature regarding the strategies they may use during CBRNe disasters. The aim of this study was to explore emergency care clinical leaders’ strategies related to managing patients following a CBRNe disaster.
Methods
Focus groups were conducted across 5 tertiary hospitals and 1 rural hospital in Queensland, Australia. Thirty-six hospital clinical leaders from the 6 study sites crucial to hospital disaster response participated in 6 focus groups undertaken between February and May 2021 that explored strategies and decision making to optimize patient care following a CBRNe disaster.
Results
Analysis revealed that the use of rehearsals, adopting new models of care, enacting current surge management processes, and applying organizational lessons were facilitating strategies. Barriers to management were also identified, including resource constraints and sites operating over capacity.
Conclusions
Enhanced education and training of clinical leaders, flexible models of care, and existing established processes and tested frameworks could strengthen a hospital’s response when managing patients following a CBRNe disaster.
Efficient evidence generation to assess the clinical and economic impact of medical therapies is critical amid rising healthcare costs and aging populations. However, drug development and clinical trials remain far too expensive and inefficient for all stakeholders. On October 25–26, 2023, the Duke Clinical Research Institute brought together leaders from academia, industry, government agencies, patient advocacy, and nonprofit organizations to explore how different entities and influencers in drug development and healthcare can realign incentive structures to efficiently accelerate evidence generation that addresses the highest public health needs. Prominent themes surfaced, including competing research priorities and incentives, inadequate representation of patient populations in clinical trials, opportunities to better leverage existing technology and infrastructure in trial design, and a need for heightened transparency and accountability in research practices. The group determined that together these elements contribute to an inefficient and costly clinical research enterprise, amplifying disparities in population health and sustaining gaps in evidence that impede advancements in equitable healthcare delivery and outcomes. The goal of addressing the identified challenges is to ultimately make clinical trials faster, more inclusive, and more efficient across diverse communities and settings.
We conducted a quantitative analysis of the microbial burden and prevalence of epidemiologically important pathogens (EIP) found on long-term care facility (LTCF) environmental surfaces.
Methods:
Microbiological samples were collected using Rodac plates (25 cm2/plate) from resident rooms and common areas in five LTCFs. EIP were defined as MRSA, VRE, C. difficile, and multidrug-resistant (MDR) Gram-negative rods (GNRs).
Results:
Rooms of residents with reported colonization had much greater EIP counts per Rodac plate (8.32 CFU, 95% CI 8.05, 8.60) than rooms of non-colonized residents (0.78 CFU, 95% CI 0.70, 0.86). Sixty-five percent of the resident rooms and 50% of the common areas were positive for at least one EIP. If a resident was labeled by the facility as colonized with an EIP, we found that EIP in only 30% of the rooms. MRSA was the most common EIP recovered, followed by C. difficile and MDR-GNR.
Discussion:
We found frequent environmental contamination with EIP in LTCFs. Colonization status of a resident was a strong predictor of higher levels of EIP being recovered from his/her room.
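Given the 25 cm2 Rodac plates described in the Methods, the per-plate counts reported in the Results convert directly to surface densities. A back-of-envelope check of that arithmetic (the CFU values are the study's point estimates; the conversion itself is simple division):

```python
PLATE_AREA_CM2 = 25  # Rodac plate area stated in the Methods

def cfu_per_cm2(cfu_per_plate, area_cm2=PLATE_AREA_CM2):
    """Convert a per-plate CFU count to a surface density (CFU per cm^2)."""
    return cfu_per_plate / area_cm2

colonized = cfu_per_cm2(8.32)      # rooms of colonized residents
non_colonized = cfu_per_cm2(0.78)  # rooms of non-colonized residents

print(round(colonized, 3))      # ~0.333 CFU/cm^2
print(round(non_colonized, 3))  # ~0.031 CFU/cm^2
print(round(8.32 / 0.78, 1))    # ~10.7-fold difference between room types
```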
Inpatient behavioral health units (BHUs) had unique challenges in implementing interventions to mitigate coronavirus disease 2019 (COVID-19) transmission, in part due to socialization in BHU settings. The objective of this study was to identify the transmission routes and the efficacy of the mitigation strategies employed during a COVID-19 outbreak in an inpatient BHU during the Omicron surge from December 2021 to January 2022.
Methods:
An outbreak investigation was performed after identifying 2 COVID-19-positive BHU inpatients on December 16 and 20, 2021. The response involved weekly point-prevalence testing of all inpatients, healthcare workers (HCWs), and staff, followed by infection prevention mitigation measures and molecular surveillance. Whole-genome sequencing was performed on a subset of COVID-19-positive individuals to identify the outbreak source. Finally, an outbreak control sustainability plan was formulated for future BHU outbreaks.
Results:
We identified 35 HCWs and 8 inpatients who tested positive in the BHU between December 16, 2021, and January 17, 2022. We generated severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) genomes from 15 HCWs and all inpatients. Phylogenetic analyses revealed 3 distinct but genetically related clusters: (1) an HCW and inpatient outbreak likely initiated by staff, (2) an HCW and inpatient outbreak likely initiated by an inpatient visitor, and (3) an HCW-only cluster initiated by staff.
Conclusions:
Distinct transmission clusters are consistent with multiple, independent SARS-CoV-2 introductions, with further inpatient transmission occurring in communal settings. The implemented outbreak control plan, comprising enhanced personal protective equipment requirements, limited socialization, and molecular surveillance, likely minimized disruptions to patient care and can serve as a model for future pandemics.
Structural and diffraction criteria for distinguishing between t-1M, c-1M, m-1M, and 3T illite varieties are described. The t-1M illite corresponds to a one-layer monoclinic structure with vacant trans-sites. The c-1M illite has vacant cis-octahedra forming one of two symmetrically independent point systems, while the other cis-octahedra as well as the trans-octahedra are occupied. The m-1M illite corresponds to a structure in which cations are statistically distributed over the available trans- and cis-sites. For t-1M, c-1M, and m-1M, the values of |c cos β/a| are equal to 0.39–0.41, 0.29–0.31, and 0.333, respectively. Application of these criteria demonstrates that illite samples described in the literature as the 3T polytype are usually c-1M instead. The relatively common occurrence of c-1M illite in association with the t-1M and 2M1 polytypes has been recognized in illite from hydrothermal alterations around uranium deposits located in the Athabasca basement (Saskatchewan, Canada). The c-1M illite from these deposits was previously described as the 3T polytype.
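The |c cos β/a| ranges above lend themselves to a simple screening function from the unit-cell parameters. A sketch only: the cutoffs are exactly the ranges quoted in the text, the tolerance is an assumption, and real identification would also rely on the structural criteria described:

```python
import math

def illite_1m_variety(a, c, beta_deg, tol=0.005):
    """Screen a one-layer monoclinic illite by |c*cos(beta)/a|.

    Ranges follow the text: t-1M ~0.39-0.41, c-1M ~0.29-0.31, m-1M ~0.333.
    The tolerance `tol` is an assumed measurement allowance; this is a
    sketch, not a substitute for full structural/diffraction analysis.
    """
    ratio = abs(c * math.cos(math.radians(beta_deg)) / a)
    if 0.39 - tol <= ratio <= 0.41 + tol:
        return "t-1M"
    if 0.29 - tol <= ratio <= 0.31 + tol:
        return "c-1M"
    if abs(ratio - 0.333) <= tol:
        return "m-1M"
    return "unclassified"
```

With cell parameters giving |c cos β/a| near 0.30, the function returns "c-1M"; near 0.40, "t-1M".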
We performed a literature review to describe the risk of surgical-site infection (SSI) in minimally invasive surgery (MIS) compared to standard open surgery. Most studies reported decreased SSI rates among patients undergoing MIS compared to open procedures. However, many were observational studies and may have been affected by selection bias. MIS is associated with reduced risk of surgical-site infection compared to standard open surgery and should be considered when feasible.
Viruses are the most numerically abundant biological entities on Earth. As ubiquitous replicators of molecular information and agents of community change, viruses have potent effects on life on Earth and may play a critical role in human spaceflight, life-detection missions to other planetary bodies, and planetary protection. However, major knowledge gaps constrain our understanding of the Earth's virosphere: (1) the role viruses play in biogeochemical cycles, (2) the origin(s) of viruses and (3) the involvement of viruses in the evolution, distribution and persistence of life. As viruses are the only replicators that span all known types of nucleic acids, an expanded experimental and theoretical toolbox built for Earth's viruses will be pivotal for detecting and understanding life on Earth and beyond. Only by filling these knowledge and technical gaps will we obtain an inclusive assessment of how to distinguish and detect life on other planetary surfaces. Meanwhile, space exploration requires life-support systems for the needs of humans, plants and their microbial inhabitants. Viral effects on microbes and plants are essential for Earth's biosphere and human health, but virus–host interactions in spaceflight are poorly understood. Viral relationships with their hosts respond to environmental changes in complex ways which are difficult to predict by extrapolating from Earth-based proxies. These relationships should be studied in space to fully understand how spaceflight will modulate viral impacts on human health and life-support systems, including microbiomes. In this review, we address key questions that must be examined to incorporate viruses into Earth system models, life-support systems and life detection. Tackling these questions will benefit our efforts to develop planetary protection protocols and further our understanding of viruses in astrobiology.
To evaluate the impact of a diagnostic stewardship intervention on Clostridioides difficile healthcare-associated infections (HAI).
Design:
Quality improvement study.
Setting:
Two urban acute care hospitals.
Interventions:
All inpatient stool testing for C. difficile required review and approval prior to specimen processing in the laboratory. An infection preventionist reviewed all orders daily through chart review and conversations with nursing; orders meeting clinical criteria for testing were approved, and orders not meeting clinical criteria were discussed with the ordering provider. The proportion of completed tests meeting clinical criteria for testing and the primary outcome of C. difficile HAI were compared before and after the intervention.
Results:
The frequency of completed C. difficile orders not meeting criteria was lower [146 (7.5%) of 1,958] in the intervention period (January 10, 2022–October 14, 2022) than in the sampled 3-month preintervention period [26 (21.0%) of 124; P < .001]. C. difficile HAI rates were 8.80 per 10,000 patient days prior to the intervention (March 1, 2021–January 9, 2022) and 7.69 per 10,000 patient days during the intervention period (incidence rate ratio, 0.87; 95% confidence interval, 0.73–1.05; P = .13).
Conclusions:
A stringent order-approval process reduced clinically nonindicated testing for C. difficile but did not significantly decrease HAIs.
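The point estimates reported in the Results above can be reproduced directly from the quoted counts and rates (a check of the arithmetic only; the confidence interval and P values require the underlying patient-day and order-level data):

```python
# HAI rates per 10,000 patient days, as reported
pre_rate = 8.80           # preintervention (March 1, 2021 - January 9, 2022)
intervention_rate = 7.69  # intervention period (January 10 - October 14, 2022)

irr = intervention_rate / pre_rate  # incidence rate ratio
print(round(irr, 2))  # 0.87, matching the reported IRR

# Proportion of completed orders not meeting clinical criteria:
print(round(26 / 124 * 100, 1))    # 21.0 (%) preintervention sample
print(round(146 / 1958 * 100, 1))  # 7.5 (%) during the intervention
```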
Individual differences in decision making are a topic of longstanding interest, but often yield inconsistent and contradictory results. After providing an overview of individual difference measures that have commonly been used in judgment and decision-making (JDM) research, we suggest that our understanding of individual difference effects in JDM may be improved by amending our approach to studying them. We propose four recommendations for improving the pursuit of individual differences in JDM research: a more systematic approach; more theory-driven selection of measures; a reduced emphasis on main effects in favor of interactions between individual differences and decision features, situational factors, and other individual differences; and more extensive communication of results (whether significant or null, published or unpublished). As a first step, we offer our database—the Decision Making Individual Differences Inventory (DMIDI; http://www.sjdm.org/dmidi), a free, public resource that categorizes and describes the most common individual difference measures used in JDM research.
In two studies, time preferences for financial gains and losses at delays of up to 50 years were elicited using three different methods: matching, fixed-sequence choice titration, and a dynamic “staircase” choice method. Matching was found to create fewer demand characteristics and to produce better fits with the hyperbolic model of discounting. The choice-based measures better predicted real-world outcomes such as smoking and payment of credit card debt. No consistent advantages were found for the dynamic staircase method over fixed-sequence titration.
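The hyperbolic model of discounting referred to above is commonly written V = A / (1 + kD), where A is the delayed amount, D the delay, and k a fitted discount rate. A minimal sketch with illustrative parameter values (not estimates from the studies):

```python
def hyperbolic_value(amount, delay_years, k):
    """Present value under the hyperbolic discounting model V = A / (1 + k*D).

    `k` is a per-year discount rate; the value used below is illustrative,
    not a parameter estimated in the studies being summarized.
    """
    return amount / (1 + k * delay_years)

# With k = 0.1, 100 units delayed 50 years are worth 100 / (1 + 5) today.
print(round(hyperbolic_value(100, 50, k=0.1), 2))  # 16.67
# Hyperbolic discounting declines more slowly at long delays than the
# exponential model, which is why 50-year horizons help discriminate fits.
print(round(hyperbolic_value(100, 1, k=0.1), 2))   # 90.91 after 1 year
```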
Studying group decision-making is challenging for multiple reasons. An important logistic difficulty is studying a sufficiently large number of groups, each with multiple participants. Assembling groups online could make this process easier and also provide access to group members more representative of real-world work groups than the samples of college students that typically comprise lab face-to-face (FtF) groups. The main goal of this paper is to compare the decisions of online groups to those of FtF groups. We did so in a study that manipulated gain/loss framing of a risky decision between groups and examined the decisions of both individual group members and groups. All of these dependent measures are compared for an online and an FtF sample. Our results suggest that web-conferencing can be a substitute for FtF interaction in group decision-making research, as we found no moderation effects of communication medium on individual or group decision outcome variables. The effects of medium that were found suggest that the use of online groups may be the preferred method for group research. To wit, discussions among the online groups were shorter but generated a greater number of thought units; that is, they made more efficient use of time.
In recent years, evidence has started piling up that some high-energy cosmic neutrinos can be associated with blazars in flaring states. On 2022 February 26, a new blazar-neutrino coincidence was reported: the track-like neutrino event IC220225A detected by IceCube is spatially coincident with the flat-spectrum radio quasar PKS 0215+015. Like previous associations, this source was found to be in a high optical and γ-ray state. Moreover, the source showed a bright radio outburst, which substantially increases the probability of a true physical association. We have performed six observations with the VLBA shortly after the neutrino event with a monthly cadence and are monitoring the source with the Effelsberg 100-m Telescope and with the Australia Telescope Compact Array. Here, we present first results on the contemporaneous parsec-scale jet structure of PKS 0215+015 in total intensity and polarization to constrain possible physical processes leading to neutrino emission in blazars.
The design and early commissioning of the ELI-Beamlines laser facility's 30 J, 30 fs, 10 Hz HAPLS (High-repetition-rate Advanced Petawatt Laser System) beam transport (BT) system to the P3 target chamber are described in detail. It is the world's first high-average-power petawatt (PW) BT system and, at 54 m, the longest ever built. It connects the HAPLS pulse compressor via the injector periscope with the 4.5 m diameter P3 target chamber of the plasma physics group in hall E3, which is the largest target chamber of the facility and the first to be connected to the BT system. The major engineering challenges are the required high-vibration-stability mirror support structures, the high-pointing-stability optomechanics, and the levels of chemical and particle cleanliness required in the vacuum vessels to preserve the high laser damage threshold of the dielectrically coated high-power mirrors. A first commissioning experiment at low pulse energy demonstrates the full functionality of the BT system to P3 and the novel experimental infrastructure.
We evaluated the ability of an ultraviolet-C (UV-C) room decontamination device to kill Candida auris and C. albicans. With an organic challenge (fetal calf serum), the UV-C device achieved log10 reductions of 4.57 for C. auris and 5.26 for C. albicans with direct line of sight, and 2.41 for C. auris and 3.96 for C. albicans with indirect line of sight.
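A log10 reduction L corresponds to killing 100·(1 − 10^(−L)) percent of organisms. Applying this standard conversion to the figures above (a check of the arithmetic only; the reductions themselves are the study's measurements):

```python
def percent_reduction(log10_reduction):
    """Convert a log10 reduction to the percentage of organisms killed."""
    return 100 * (1 - 10 ** (-log10_reduction))

# Direct line of sight, with organic challenge:
print(round(percent_reduction(4.57), 3))  # C. auris: 99.997 (%)
print(round(percent_reduction(5.26), 3))  # C. albicans: 99.999 (%)
# Indirect line of sight:
print(round(percent_reduction(2.41), 2))  # C. auris: 99.61 (%)
```

The conversion makes clear why even the "lower" indirect-exposure reductions still represent well over 99% kill.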
We evaluated the robustness of sterilization technologies when spores and bacteria were placed on “dirty” instruments and overlaid with blood. The results illustrate that steam sterilization is the most effective sterilization technology with the largest margin of safety, followed by ethylene oxide and hydrogen peroxide gas plasma.
Anxiety disorders are among the most common psychiatric disorders worldwide. They often onset early in life, with symptoms and consequences that can persist for decades. This makes anxiety disorders some of the most debilitating and costly disorders of our time. Although much is known about the synaptic and circuit mechanisms of fear and anxiety, research on the underlying genetics has lagged behind that of other psychiatric disorders. However, with the formation of the Psychiatric Genomics Consortium Anxiety workgroup, progress is now advancing rapidly, offering opportunities for future research.
Here we review current knowledge about the genetics of anxiety across the lifespan from genetically informative designs (i.e. twin studies and molecular genetics). We include studies of specific anxiety disorders (e.g. panic disorder, generalised anxiety disorder) as well as those using dimensional measures of trait anxiety. We particularly address findings from large-scale genome-wide association studies and show how such discoveries may provide opportunities for translation into improved or new therapeutics for affected individuals. Finally, we describe how discoveries in anxiety genetics open the door to numerous new research possibilities, such as the investigation of specific gene–environment interactions and the disentangling of causal associations with related traits and disorders.
We discuss how the field of anxiety genetics is expected to move forward. In addition to the obvious need for larger sample sizes in genome-wide studies, we highlight the need for studies among young people, focusing on specific underlying dimensional traits or components of anxiety.