Objective:
To understand healthcare workers’ (HCWs) beliefs and practices toward blood culture (BCx) use.
Design:
Cross-sectional electronic survey and semi-structured interviews.
Setting:
Academic hospitals in the United States.
Participants:
HCWs involved in BCx ordering and collection in adult intensive care units (ICU) and wards.
Methods:
Between February and November 2023, we administered an anonymous electronic survey to HCWs and conducted semi-structured interviews with unit staff and quality improvement (QI) leaders at these institutions to understand their perspectives on BCx stewardship.
Results:
Of 314 HCWs who responded to the survey, most were physicians (67.4%) and most were involved in BCx ordering (82.3%). Most survey respondents reported that clinicians had a low threshold to culture patients for fever (84.4%) and agreed they could safely reduce the number of BCx obtained in their units (65%). However, only half believed BCx was overused. Although most reported making BCx decisions as a team (74.1%), fewer than half reported that these team discussions occurred daily (42.4%). A third of respondents reported not usually collecting the correct volume per BCx bottle, half were unaware of the improved sensitivity of 2 BCx sets, and most were unsure of the nationally recommended BCx contamination threshold (87.5%). Knowledge regarding the utility of BCx for common infections was limited.
Conclusions:
HCWs’ understanding of best collection practices and yield of BCx was limited.
Changing practice patterns caused by the pandemic have created an urgent need for guidance on prescribing stimulants via telepsychiatry for attention-deficit hyperactivity disorder (ADHD). A notable spike in stimulant prescribing accompanied the suspension of the Ryan Haight Act, which allowed stimulants to be prescribed without a face-to-face meeting. Competing forces both for and against prescribing ADHD stimulants by telepsychiatry have emerged, requiring guidelines to balance these factors. On the one hand, factors weighing in favor of increasing the availability of treatment for ADHD via telepsychiatry include enhanced access to care, reduction in the large number of untreated cases, and prevention of the known adverse outcomes of untreated ADHD. On the other hand, factors in favor of limiting telepsychiatry for ADHD include mitigating the possibility of exploiting telepsychiatry for profit or for misuse, abuse, and diversion of stimulants. This Expert Consensus Group has developed numerous specific guidelines and advocates for some flexibility in allowing telepsychiatry evaluations and treatment to continue without an in-person evaluation. These guidelines also recognize the need to give greater scrutiny to certain subpopulations, such as young adults without a prior diagnosis or treatment of ADHD who request immediate-release stimulants, which should increase the suspicion of possible medication diversion, misuse, or abuse. In such cases, nonstimulants, controlled-release stimulants, or psychosocial interventions should be prioritized. We encourage the use of outside informants to corroborate the history, the use of rating scales, and access to a hybrid model of both in-person and remote treatment.
We recently reported on the radio-frequency attenuation length of cold polar ice at Summit Station, Greenland, based on bi-static radar measurements of radio-frequency bedrock echo strengths taken during the summer of 2021. Those data also allow studies of (a) the relative contributions of coherent (such as discrete internal conducting layers with sub-centimeter transverse scale) vs incoherent (e.g. bulk volumetric) scattering, (b) the magnitude of internal layer reflection coefficients, (c) limits on signal propagation velocity asymmetries (‘birefringence’) and (d) limits on in-ice signal dispersion over a bandwidth of ~100 MHz. We find that (1) attenuation lengths approach 1 km in our band, (2) after averaging 10 000 echo triggers, reflected signals observable above the thermal noise floor (to depths of ~1500 m) are consistent with being entirely coherent, (3) internal layer reflectivities are approximately −60 to −70 dB, (4) birefringent effects for vertically propagating signals are smaller by an order of magnitude relative to South Pole and (5) within our experimental limits, glacial ice is non-dispersive over the frequency band relevant for neutrino detection experiments.
Over the last 25 years, radiowave detection of neutrino-generated signals, using cold polar ice as the neutrino target, has emerged as perhaps the most promising technique for detection of extragalactic ultra-high energy neutrinos (corresponding to neutrino energies in excess of 0.01 Joules, or $10^{17}$ electron volts). During the summer of 2021 and in tandem with the initial deployment of the Radio Neutrino Observatory in Greenland (RNO-G), we conducted radioglaciological measurements at Summit Station, Greenland to refine our understanding of the ice target. We report the result of one such measurement, the radio-frequency electric field attenuation length $L_\alpha$. We find an approximately linear dependence of $L_\alpha$ on frequency with the best fit of the average field attenuation for the upper 1500 m of ice: $\langle L_\alpha \rangle = ( ( 1154 \pm 121) - ( 0.81 \pm 0.14) \, ( \nu /{\rm MHz}) ) \,{\rm m}$ for frequencies $\nu \in [145, 350]$ MHz.
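To make the fitted attenuation model concrete, the following is a minimal Python sketch that evaluates $\langle L_\alpha \rangle$ and a propagated uncertainty across the quoted band. It assumes the intercept and slope uncertainties are uncorrelated, since the fit covariance is not given in this summary; treat the error bars as illustrative only.

```python
import numpy as np

# Best-fit field attenuation length quoted above:
#   <L_alpha> = (1154 ± 121) m - (0.81 ± 0.14) m/MHz * nu
# valid for nu in [145, 350] MHz (upper ~1500 m of ice at Summit Station).
A, dA = 1154.0, 121.0      # intercept and its uncertainty [m]
B, dB = 0.81, 0.14         # slope and its uncertainty [m/MHz]

def attenuation_length(nu_mhz):
    """Central value of <L_alpha> in metres at frequency nu [MHz]."""
    return A - B * nu_mhz

def attenuation_length_err(nu_mhz):
    """Propagated uncertainty, assuming uncorrelated fit parameters
    (an assumption; the fit covariance is not reported in the abstract)."""
    return np.hypot(dA, dB * nu_mhz)

for nu in (145, 250, 350):
    L, dL = attenuation_length(nu), attenuation_length_err(nu)
    print(f"nu = {nu:3d} MHz: <L_alpha> = {L:6.0f} +/- {dL:3.0f} m")
```

Evaluated this way, the attenuation length falls from roughly 1 km at the low end of the band toward the mid-800 m range at 350 MHz, consistent with the statement that attenuation lengths approach 1 km in-band.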
Organizational actors spend a tremendous amount of time and energy trying to intentionally change their routines. We conceptualize these intentional changes as routine design—intentional efforts to change one or more aspects of a routine to create a preferred situation. We review existing routines research on intentional change by showing how different perspectives on routines have generated different insights about the relationship between intentional change and design. We highlight a cognitive perspective, a practice perspective, and an ontological process perspective on routine design. We then draw on two perspectives inspired by design studies. Simon’s scientific perspective on design suggests that routines scholars study the effects and implications of designing artifacts. Schön’s reflective practice perspective on design suggests that routines scholars can examine how actors set the problem, engage in (re)framing, and in reflection-in-action. These design studies perspectives offer routines scholars a better understanding of efforts to intentionally change routines. Based on these insights from design studies, we develop a future research agenda for routine design.
Organizations increasingly rely upon algorithms to change their routines—with positive, negative, or messy outcomes. In this chapter, we argue that conceptualizing algorithms as an integral part of an assemblage provides scholars with the ability to generate novel theories about how algorithms influence routine dynamics. First, we review existing research that shows how algorithms operate as an actant making decisions; encode the intentions of designers; are entangled in broader assemblages of theories, artifacts, actors, and practices; and generate performative effects. Second, we elucidate five analytical approaches that can help management scholars to identify new connections between routine assemblages, their elements, and organizational outcomes. Finally, we outline directions for future research to explore how studying algorithms can advance our understanding of routine dynamics and how a routine dynamics perspective can contribute to the understanding of algorithms in strategy and organizational theory more broadly.
Small islands can guide visualization of the diverse information requirements of future context-relevant coastal governance. On small marine islands (<20 000 km²), negative effects of coastal challenges (e.g., related to population growth, unsustainable resource use or climate change) can develop rapidly, with high intensity and extreme impacts. The smallest and most remote islands within small-island states and small islands in larger states can be threatened by intrinsic governance factors, typically resulting in access to fewer resources than larger islands or administrative centres. For these reasons, efforts to support coastal change governance are critical and need to be targeted. We propose a conceptual framework that distinguishes key governance-related components of small-island social–ecological systems (SESs). To prioritize areas of vulnerability and opportunity, physical, ecological, social, economic and governance attributes are visualized to help show the ability of different types of small-island SESs to adapt, or be transformed, in the face of global and local change. Application of the framework to an Indonesian archipelago illustrates examples of local rule enforcement supporting local self-organized marine governance. Visualization of complex and interconnected social, environmental and economic changes in small-island SESs provides a better understanding of the vulnerabilities and opportunities related to context-specific governance.
Because the Anthropocene by definition is an epoch during which environmental change is largely anthropogenic and driven by social, economic, psychological and political forces, environmental social scientists can effectively analyse human behaviour and knowledge systems in this context. In this subject review, we summarize key ways in which the environmental social sciences can better inform fisheries management policy and practice and marine conservation in the Anthropocene. We argue that environmental social scientists are particularly well positioned to synergize research to fill the gaps between: (1) local behaviours/needs/worldviews and marine resource management and biological conservation concerns; and (2) large-scale drivers of planetary environmental change (globalization, affluence, technological change, etc.) and local cognitive, socioeconomic, cultural and historical processes that shape human behaviour in the marine environment. To illustrate this, we synthesize the roles of various environmental social science disciplines in better understanding the interaction between humans and tropical marine ecosystems in developing nations where issues arising from human–coastal interactions are particularly pronounced. We focus on: (1) the application of the environmental social sciences in marine resource management and conservation; (2) the development of ‘new’ socially equitable marine conservation; (3) repopulating the seascape; (4) incorporating multi-scale dynamics of marine social–ecological systems; and (5) envisioning the future of marine resource management and conservation for producing policies and projects for comprehensive and successful resource management and conservation in the Anthropocene.
Associations between employment status and mental health are well recognised, but evidence based on robust mental health measures is sparse on the relationship between paid employment and mental health in the years running up to statutory retirement ages. In addition, there has been no investigation into the stability of this relationship over time: an important consideration if survey findings are used to inform future policy. The aim of this study is to investigate the association between employment status and common mental disorder (CMD) in 50–64-year-old residents in England and its stability over time, taking advantage of three national mental health surveys carried out over a 14-year period.
Methods.
Data were analysed from the British National Surveys of Psychiatric Morbidity of 1993, 2000 and 2007. Paid employment status was the primary exposure of interest and CMD the primary outcome – both ascertained identically in all three surveys (CMD from the revised Clinical Interview Schedule). Multivariable logistic regression models were used.
Results.
The prevalence of CMD was higher in people not in paid employment across all survey years; however, this association was present only for non-employment attributed to poor health and was not apparent in those citing other reasons for non-employment. Odds ratios for the association between non-employment due to ill health and CMD were 3.05 in 1993, 3.56 in 2000, and 2.80 in 2007, after adjustment for age, gender, marital status, education, social class, housing tenure, financial difficulties, smoking status, recent physical health consultation and activities of daily living impairment.
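For readers unfamiliar with how such adjusted odds ratios are produced, the sketch below is purely illustrative: it fits a multivariable logistic regression to simulated data (a hypothetical exposure, a hypothetical confounder, and made-up effect sizes, not the survey data) and exponentiates the coefficient of interest to obtain an adjusted OR with a 95% CI.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated, hypothetical data only; not the National Survey data.
rng = np.random.default_rng(0)
n = 5000
non_employed_ill_health = rng.binomial(1, 0.10, n)   # hypothetical exposure
age = rng.uniform(50, 64, n)                         # hypothetical confounder
# Hypothetical outcome model: log-odds of CMD rise with the exposure.
logit = -2.0 + 1.1 * non_employed_ill_health + 0.02 * (age - 57)
cmd = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(pd.DataFrame({
    "non_employed_ill_health": non_employed_ill_health,
    "age": age,
}))
fit = sm.Logit(cmd, X).fit(disp=False)

# Adjusted OR = exp(coefficient) from the multivariable model.
or_adj = np.exp(fit.params["non_employed_ill_health"])
ci_low, ci_high = np.exp(fit.conf_int().loc["non_employed_ill_health"])
print(f"adjusted OR = {or_adj:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```

The quoted survey estimates of roughly 2.8 to 3.6 were obtained in the same way, but with the full set of covariates listed above rather than a single confounder.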
Conclusions.
The prevalence of CMD was higher in people not in paid employment for health reasons, but CMD was not associated with non-employment for other reasons. Associations remained relatively stable in strength from 1993 to 2007 across these three cross-sectional, nationally representative samples.
Reliable and precise ages of Quaternary pedogenic carbonate can be obtained with ²³⁰Th/U dating by thermal ionization mass spectrometry applied to carefully selected milligram-size samples. Datable carbonate can form within a few thousand years of surface stabilization, allowing ages of Quaternary deposits and surfaces to be closely estimated. Pedogenic carbonate clast-rinds from gravels of glacio-fluvial terraces in the Wind River Basin have median concentrations of 14 ppm U and 0.07 ppm ²³²Th, with median (²³⁰Th/²³²Th) = 270, making them well suited for ²³⁰Th/U dating. Horizons as thin as 0.5 mm were sampled from polished slabs to reduce averaging of long (≥10⁵ yr), and sometimes visibly discontinuous, depositional histories. Dense, translucent samples with finite ²³⁰Th/U ages preserve within-rind stratigraphic order in all cases. Ages for terraces WR4 (167,000 ± 6,400 yr) and WR2 (55,000 ± 8,600 yr) indicate a mean incision rate of 0.26 ± 0.05 m per thousand years for the Wind River over the past glacial cycle, slower than inferred from cosmogenic-nuclide dating. Terrace WR3, which formed penecontemporaneously with the final maximum glacial advance of the penultimate Rocky Mountain (Bull Lake) glaciation, has an age of 150,000 ± 8,300 yr, indicating that it is broadly synchronous with the penultimate global ice volume maximum.
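As a rough reading aid for the incision-rate arithmetic, the sketch below uses the relation rate = terrace height above the modern river / terrace age. Only the ages, age uncertainties and the mean rate are taken from the abstract; terrace heights are not given here, so the script reports the heights implied by the quoted rate and the portion of rate uncertainty contributed by the age uncertainties alone.

```python
# Back-of-envelope check of the relation behind the quoted incision rate:
#   rate [m/kyr] = terrace height above the modern river [m] / terrace age [kyr]
# Ages and the mean rate are from the abstract; terrace heights are not given,
# so we compute only the heights *implied* by the quoted rate and the part of
# the rate uncertainty that comes from the age uncertainties alone.
rate = 0.26                                          # m/kyr (quoted as 0.26 ± 0.05)
ages_kyr = {"WR4": (167.0, 6.4), "WR2": (55.0, 8.6)}  # (age, uncertainty) in kyr

for name, (age, dage) in ages_kyr.items():
    implied_height = rate * age                      # m, implied rather than measured
    drate_from_age = rate * dage / age               # rate error from age error alone
    print(f"{name}: implied height ~ {implied_height:.0f} m; "
          f"age uncertainty alone contributes +/- {drate_from_age:.3f} m/kyr")
```

The remainder of the quoted ±0.05 m/kyr uncertainty would come from the (unquoted) uncertainty in the terrace heights themselves.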
Many adults with autism spectrum disorder (ASD) remain undiagnosed. Specialist assessment clinics enable the detection of these cases, but such services are often overstretched. It has been proposed that unnecessary referrals to these services could be reduced by prioritizing individuals who score highly on the Autism-Spectrum Quotient (AQ), a self-report questionnaire measure of autistic traits. However, the ability of the AQ to predict who will go on to receive a diagnosis of ASD in adults is unclear.
Method
We studied 476 adults, seen consecutively at a national ASD diagnostic referral service for suspected ASD. We tested AQ scores as predictors of ASD diagnosis made by expert clinicians according to International Classification of Diseases (ICD)-10 criteria, informed by the Autism Diagnostic Observation Schedule-Generic (ADOS-G) and Autism Diagnostic Interview-Revised (ADI-R) assessments.
Results
Of the participants, 73% received a clinical diagnosis of ASD. Self-report AQ scores did not significantly predict receipt of a diagnosis. While AQ scores provided high sensitivity of 0.77 [95% confidence interval (CI) 0.72–0.82] and positive predictive value of 0.76 (95% CI 0.70–0.80), the specificity of 0.29 (95% CI 0.20–0.38) and negative predictive value of 0.36 (95% CI 0.22–0.40) were low. Thus, 64% of those who scored below the AQ cut-off were ‘false negatives’ who did in fact have ASD. Co-morbidity data revealed that generalized anxiety disorder may ‘mimic’ ASD and inflate AQ scores, leading to false positives.
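A short sketch may help connect these numbers: using only the rounded figures quoted above (prevalence 0.73, sensitivity 0.77, specificity 0.29 in 476 adults), it rebuilds an approximate 2×2 table and recovers the positive and negative predictive values, and hence the share of below-cut-off scorers who nonetheless had ASD (1 − NPV, quoted as 64%). Small differences from the published values are expected because the exact cell counts are not given here.

```python
# Approximate reconstruction of the screening 2x2 table from rounded figures
# quoted in the abstract; published PPV/NPV differ slightly because the exact
# counts are not available here.
n = 476
prevalence, sensitivity, specificity = 0.73, 0.77, 0.29

tp = sensitivity * prevalence * n              # true positives (ASD, above cut-off)
fn = (1 - sensitivity) * prevalence * n        # false negatives (ASD, below cut-off)
tn = specificity * (1 - prevalence) * n        # true negatives
fp = (1 - specificity) * (1 - prevalence) * n  # false positives

ppv = tp / (tp + fp)
npv = tn / (tn + fn)
print(f"PPV ~ {ppv:.2f}, NPV ~ {npv:.2f}")
print(f"Share of below-cut-off scorers who actually had ASD ~ {1 - npv:.0%}")
```

The key point the arithmetic makes explicit is that with a prevalence as high as 73% in a specialist referral sample, even a test with reasonable sensitivity but poor specificity leaves most below-cut-off scorers undiagnosed.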
Conclusions
The AQ's utility for screening referrals was limited in this sample. Recommendations supporting the AQ's role in the assessment of adult ASD, e.g. UK NICE guidelines, may need to be reconsidered.
Accurate data on the incidence of West Nile virus (WNV) disease are important for directing public health education and control activities. The objective of this project was to assess the underdiagnosis of WNV neuroinvasive disease through laboratory testing of patients with suspected viral meningitis or encephalitis at selected hospitals serving WNV-endemic regions in three states. Of the 279 patients with cerebrospinal fluid (CSF) specimens tested for WNV immunoglobulin M (IgM) antibodies, 258 (92%) were negative, 19 (7%) were positive, and two (1%) had equivocal results. Overall, 63% (12/19) of patients with WNV IgM-positive CSF had WNV IgM testing ordered by their attending physician. Seven (37%) cases would not have been identified as probable WNV infections without the further testing conducted through this project. These findings indicate that over a third of WNV infections in patients with clinically compatible neurological illness might be undiagnosed due to either lack of testing or inappropriate testing, leading to substantial underestimates of WNV neuroinvasive disease burden. Efforts should be made to educate healthcare providers and laboratorians about the local epidemiology of arboviral diseases and the optimal tests to be used in different clinical situations.
How can Japan put its past behind it? Scholars, journalists, and activists frequently argue that Japan cannot solve its “history problem” unless it follows West Germany's lead in offering contrition for World War II violence. Into this debate, Jennifer Lind's Sorry States: Apologies in International Politics offers an original and provocative contribution. Lind argues that while countries should acknowledge past atrocities, frequent public apologies can be domestically polarizing and diplomatically counterproductive. Sorry States outlines a theory of remembrance and threat perception and tests it in a comparative study of Japanese-South Korean and Franco-German relations after World War II. Its methods, data, and findings will interest not only East Asianists, but also scholars of international reconciliation and security studies more broadly. This roundtable presents three critical essays in addition to a response by the author. They discuss the mechanisms through which historical memory influences perceptions of threat, the relative weight of ideational versus material factors in threat perception, and whether changes in international norms and economic interdependence may increasingly pressure countries to confront past violence.
Velo-cardio-facial syndrome (VCFS) is associated with deletions at chromosome 22q11, abnormalities in brain anatomy and function, and schizophrenia-like psychosis. Thus it is assumed that one or more genes within the deleted region are crucial to brain development. However, relatively little is known about how genetic variation at 22q11 affects brain structure and function. One gene on 22q11 is catechol-O-methyltransferase (COMT), which encodes a dopamine-degrading enzyme and contains a functional polymorphism (Val158Met) that affects enzyme activity. Here, we investigated the effect of the COMT Val158Met polymorphism on brain anatomy and cognition in adults with VCFS.
Method
The COMT Val158Met polymorphism was genotyped in 26 adults with VCFS from whom DNA was available. We explored its effects on regional brain volumes using hand-tracing approaches; on regional grey- and white-matter density using computerized voxel-based analyses; and on measures of attention, IQ, memory, and executive and visuospatial function using a comprehensive neuropsychological test battery.
Results
After corrections for multiple comparisons, Val-hemizygous subjects, compared with Met-hemizygotes, had significantly larger frontal lobe volumes. Val-hemizygotes also had significantly increased grey-matter density in the cerebellum, brainstem, and parahippocampal gyrus, and decreased white-matter density in the cerebellum. No significant effects of COMT genotype on neurocognitive performance were found.
Conclusions
COMT genotype effects on brain anatomy in VCFS are not limited to frontal regions but also involve other structures previously implicated in VCFS. This suggests variation in COMT activity is implicated in brain development in VCFS.
Salmonella Goldcoast (SGC), an uncommon serotype in Germany, was identified in 25 isolates between 1 April and 7 May 2001. To determine the cause of the outbreak, we conducted a matched case-control study including 24 cases and 51 controls. In a multivariable regression model, only consumption of a raw fermented sausage manufactured by a local company remained significant (adjusted odds ratio 20·0, 95% confidence interval 2·7–302·5). SGC isolates from case-patients shared an indistinguishable pulsed-field gel electrophoresis pattern. Part of the raw fermented sausage produced was sold after only 4 days of fermentation. Samples from the premises and products of the company were negative for SGC. However, raw fermented sausage with such a short fermentation time is more likely to contain pathogens. Irradiation of raw ingredients is not accepted by German consumers; thus, strict adherence to good manufacturing practices and the use of HACCP programmes, as well as on-farm programmes, remain crucial to reducing Salmonella.