A truly unique, all-embracing narrative of the American war in Afghanistan, told in the words of its architects. Choosing Defeat takes an unparalleled inside look at America's longest war, pulling back the curtain on the deliberations behind the scenes. The author combines his own extensive experience in the Army, the CIA, and the White House with interviews of policymakers within the Bush, Obama, and Trump administrations to produce a groundbreaking study of how American leaders make wartime decisions. Transporting you inside the White House Situation Room, every key strategic debate over twenty years – from the immediate aftermath of 9/11, to Obama's surge and withdrawal, to Trump's negotiations with the Taliban, and Biden's final pullout – is carefully reconstructed. Paul D. Miller identifies issues in US leadership, governance, military strategy, and policymaking that extend beyond the war in Afghanistan and point to deeper problems in American foreign policy.
Tackling climate change requires long-term commitment to action, yet an array of influential parties with vested interests stand opposed to this. How best to engage and balance these positions for positive change is of increasing concern for advocates and policy makers. Exploring a discord within climate change policy and politics, this insightful volume critically examines the competing assumptions and arguments underpinning political 'stability' versus 're/politicization' as a means of securing effective, long-term climate action. A range of cases exemplify the different political systems and power structures that underpin this antagonism, spanning geographical approaches, examples of non-governmental action, and key industries in the global economy. Authored by an international team of scholars, this book will be of interest to researchers of local, national, and international legislation, specialists on climate governance policy, and other scholars involved in climate action. This title is also available as Open Access on Cambridge Core.
Between 1894 and 1936, Imperial Japan fought several “small wars” against Tonghak Rebels, Taiwanese millenarians, Korean Righteous Armies, Germans in Shandong, Taiwan Indigenous Peoples, and “bandits” in Manchuria. Authoritative accounts of Japanese history ignore these wars, or sanitize them as “seizures,” “cessions,” or occasions for diplomatic maneuvers. Consigning the empire's “small wars” to footnotes (at best) has in turn promoted a view that Japanese history consists of alternating periods of “peacetime” (constitutionalism) and “wartime” (militarism), in accord with the canons of liberal political theory. However, the co-existence of “small wars” with imperial Japan's iconic wars indicates that Japan was a nation at war from 1894 through 1945. Therefore, the concept “Forever War” recommends itself for thinking about militarism and democracy as complementary formations, rather than as opposed forces. The Forever-War approach emphasizes lines of continuity that connect “limited wars” (which mobilized relatively few Japanese soldiers and civilians, but were nonetheless catastrophic for the colonized and occupied populations on the ground) with “total wars” (which mobilized the whole Japanese nation against the Qing, imperial Russia, nationalist China, and the United States). The steady if unspectacular operations of Forever War – armed occupations, settler colonialism, military honor-conferral events, and annual ceremonies at Yasukuni Shrine – continued with little interruption even during Japan's golden age of democracy and pacifism in the 1920s. This article argues that Forever War laid the infrastructural groundwork for “total war” in China from 1937 onwards, while it produced a nation of decorated, honored, and mourned veterans, in whose names the existing empire was defended at all costs against the United States in the 1940s.
In Forever War – whether in imperial Japan or elsewhere – soldiering and military service become ends in themselves, and “supporting the troops” becomes part of unthinking common sense.
To improve early intervention and personalise treatment for individuals early on the psychosis continuum, a greater understanding of symptom dynamics is required. We address this by identifying and evaluating the movement between empirically derived attenuated psychotic symptomatic substates—clusters of symptoms that occur within individuals over time.
Methods
Data came from a 90-day daily diary study evaluating attenuated psychotic and affective symptoms. The sample included 96 individuals aged 18–35 on the psychosis continuum, divided into four subgroups of increasing severity based on their psychometric risk of psychosis, with the fourth meeting ultra-high risk (UHR) criteria. A multilevel hidden Markov modelling (HMM) approach was used to characterise and determine the probability of switching between symptomatic substates. Individual substate trajectories and time spent in each substate were subsequently assessed.
Results
Four substates of increasing psychopathological severity were identified: (1) low-grade affective symptoms with negligible psychotic symptoms; (2) low levels of nonbizarre ideas with moderate affective symptoms; (3) low levels of nonbizarre ideas and unusual thought content, with moderate affective symptoms; and (4) moderate levels of nonbizarre ideas, unusual thought content, and affective symptoms. Perceptual disturbances predominantly occurred within the third and fourth substates. UHR individuals had a reduced probability of switching out of the two most severe substates.
Conclusions
Findings suggest that individuals reporting unusual thought content, rather than nonbizarre ideas in isolation, may exhibit symptom dynamics with greater psychopathological severity. Individuals at a higher risk of psychosis exhibited persistently severe symptom dynamics, indicating a potential reduction in psychological flexibility.
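The substate-switching dynamics described in this abstract can be made concrete with a toy example. The study fitted a multilevel HMM; the sketch below is a deliberately simplified single-level version, with made-up initial, transition, and emission probabilities for four substates of increasing severity, scored with the standard scaled forward algorithm over a binary symptom-diary series. All numbers are illustrative assumptions, not estimates from the study.

```python
# Toy single-level HMM over four symptomatic substates (illustrative only;
# the study used a multilevel HMM fitted to 90-day diary data).
import numpy as np

states = ["S1_low_affective", "S2_nonbizarre", "S3_plus_unusual", "S4_moderate_all"]

# Made-up probability of staying in / switching between substates per day.
trans = np.array([
    [0.80, 0.15, 0.04, 0.01],
    [0.20, 0.60, 0.15, 0.05],
    [0.05, 0.15, 0.65, 0.15],
    [0.02, 0.08, 0.20, 0.70],
])
init = np.array([0.55, 0.25, 0.15, 0.05])       # made-up initial distribution
emit_high = np.array([0.05, 0.30, 0.55, 0.80])  # P(high-symptom day | substate)

def forward_loglik(obs):
    """Log-likelihood of a binary diary series (1 = high-symptom day),
    computed with the scaled forward recursion."""
    emit = np.where(np.array(obs)[:, None] == 1, emit_high, 1.0 - emit_high)
    alpha = init * emit[0]
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for t in range(1, len(obs)):
        alpha = (alpha @ trans) * emit[t]
        loglik += np.log(alpha.sum())
        alpha = alpha / alpha.sum()
    return float(loglik)
```

A reduced probability of switching out of a severe substate, as reported for UHR individuals, would correspond to larger diagonal entries in the bottom rows of a fitted transition matrix.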
Over a hundred gravitational-wave signals have now been detected from the mergers of black holes and neutron stars, but other sources of gravitational waves have not yet been discovered. Some of the most violent explosive events in the Universe are predicted to emit bursts of gravitational waves and may result in the next big multi-messenger discovery. Gravitational-wave burst signals often have an unknown waveform shape and unknown gravitational-wave energy, due to unknown or very complicated progenitor astrophysics. Potential sources of gravitational-wave bursts include core-collapse supernovae, cosmic strings, fast radio bursts, eccentric binary systems, and gravitational-wave memory. In this review, we discuss the astrophysical properties of the main predicted sources of gravitational-wave bursts and the known features of their gravitational-wave emission. We summarise their future detection prospects and discuss the challenges of searching for gravitational-wave burst signals and interpreting the astrophysics of the source.
Grazing is a crucial component of dairy farms across many regions of the world. This review explores challenges related to grazing infrastructure and opportunities for future improvement. Farmers who aim to increase pasture utilisation face heightened inter-animal competition, a consequence of the pasture restriction required to achieve target post-grazing sward heights. Increasing the frequency of fresh pasture allocation beyond once per day has been observed to reduce milk production in primiparous animals, owing to intensified competition for limited feed resources. Implementing grazing paddocks tailored for 24- to 36-hour allocations helps to mitigate inter-animal competition while preventing the grazing of fresh regrowth. Crucial to this approach is farm roadway infrastructure that allows access to all sections of the grazing platform. However, these roadway networks have often been developed without a comprehensive assessment of their impact on the efficiency of the dairy herd's movement between grazing paddocks and the milking parlour, which is most strongly influenced by the location of the milking parlour within the grazing platform. Extreme walking distances or challenging terrain on farm roadways may reduce milk production per cow, and roadway surface quality and width significantly influence cow throughput. Recent studies have highlighted inadequate roadway widths on many farms relative to their herd size, while poor surface condition may also be limiting throughput. Enhancing the width and surface condition of farm roadways may therefore improve labour efficiency on commercial farms.
Duchenne muscular dystrophy is a devastating neuromuscular disorder characterized by the loss of dystrophin, inevitably leading to cardiomyopathy. Despite publications on prophylaxis and treatment with cardiac medications to mitigate cardiomyopathy progression, gaps remain in the specifics of medication initiation and optimization.
Method:
This document is an expert opinion statement, addressing a critical gap in cardiac care for Duchenne muscular dystrophy. It provides thorough recommendations for the initiation and titration of cardiac medications based on disease progression and patient response. Recommendations are derived from the expertise of the Advance Cardiac Therapies Improving Outcomes Network and are informed by established guidelines from the American Heart Association, American College of Cardiology, and Duchenne Muscular Dystrophy Care Considerations. These expert-derived recommendations aim to navigate the complexities of Duchenne muscular dystrophy-related cardiac care.
Results:
Comprehensive recommendations for initiation, titration, and optimization of critical cardiac medications are provided to address Duchenne muscular dystrophy-associated cardiomyopathy.
Discussion:
The management of Duchenne muscular dystrophy requires a multidisciplinary approach. However, the diversity of healthcare providers involved in Duchenne muscular dystrophy care can result in variations in cardiac management, complicating treatment standardization and compromising patient outcomes. This report aims to provide a roadmap for managing Duchenne muscular dystrophy-associated cardiomyopathy by elucidating the timing and dosage nuances crucial for optimal therapeutic efficacy, ultimately improving cardiac outcomes and the quality of life of individuals with Duchenne muscular dystrophy.
Conclusion:
This document seeks to establish a standardized framework for cardiac care in Duchenne muscular dystrophy, aiming to improve cardiac prognosis.
To evaluate the impact of implementing a clinical care guideline for uncomplicated gram-negative bloodstream infections (GN-BSI) within a health system.
Design:
Retrospective, quasi-experimental study.
Setting:
A large academic safety-net institution.
Participants:
Adults (≥18 years) with GN-BSI, defined by at least one positive blood culture for specific gram-negative organisms. Patients with polymicrobial cultures or contaminants were excluded.
Interventions:
Implementation of a GN-BSI clinical care guideline based on a 2021 consensus statement, emphasizing 7-day antibiotic courses, use of highly bioavailable oral antibiotics, and minimizing repeat blood cultures.
Results:
The study included 147 patients pre-intervention and 169 post-intervention. Interrupted time series analysis showed a reduction in the median duration of therapy (–2.3 days, P = .0016), with a sustained decline (slope change –0.2103, P = .005) post-intervention. More patients received 7 days of therapy (12.9% vs 58%, P < .01), oral antibiotic transitions increased (57.8% vs 72.2%, P < .05), and guideline-concordant oral antibiotic selection was high. Repeat blood cultures decreased (50.3% vs 30.2%, P < .01) without an increase in recurrent bacteremia. No significant differences were observed in 90-day length of stay, rehospitalization, recurrence, or mortality.
Conclusions:
Guideline implementation was associated with shorter antibiotic therapy durations, increased use of guideline-concordant oral antibiotics, and fewer repeat blood cultures without compromising patient outcomes. These findings support the effectiveness of institutional guidelines in standardizing care, optimizing resource utilization, and promoting evidence-based practices in infectious disease management.
Growing evidence highlights the critical role of patient choice of treatment, with significant benefits for outcomes found in some studies. While four meta-analyses have previously examined the association between treatment choice and outcomes in mental health, robust conclusions have been limited by the inclusion of studies with biased preference trial designs. The current systematic review included 30 studies across three common and frequently comorbid mental health disorders (depression, N = 23; anxiety, N = 5; eating disorders, N = 2), including 7055 participants (Mage = 42.5 years, SD = 11.7; 69.5% female). Treatment choice most often occurred between psychotherapy and antidepressant medication (43.3%), followed by choice between two different forms of psychotherapy, or elements within psychotherapy (36.7%). There were insufficient studies with stringent designs to conduct meta-analyses for anxiety or eating disorders as outcomes, or for treatment uptake. Treatment choice significantly improved outcomes for depression (d = 0.17, n = 18) and decreased therapy dropout, both in a combined sample targeting depression (n = 12), anxiety (n = 4) and eating disorders (n = 1; OR = 1.46, 95% CI: 1.17, 1.83), and in a smaller sample of the depression studies alone (OR = 1.65, 95% CI: 1.05, 2.59). All studies evaluated the impact of adults making treatment choices, with none examining the effect of choice in adolescents. Clear directions for future research are indicated: designing studies that can adequately test the association between treatment choice and outcome in anxiety and eating disorder treatment, and in youth.
Bathing intensive care unit (ICU) patients with chlorhexidine gluconate (CHG) decreases healthcare-associated infections (HAIs). The optimal method of CHG bathing remains undefined.
Methods:
Prospective crossover study comparing CHG daily bathing with 2% CHG-impregnated cloths versus 4% CHG solution. In phase 1, from January 2020 through March 2020, 1 ICU utilized 2% cloths, while the other ICU utilized 4% solution. After an interruption caused by the coronavirus disease 2019 pandemic, in phase 2, from July 2020 through September 2020, the unit CHG bathing assignments were reversed. Swabs were performed 3 times weekly from patients’ arms and legs to measure skin microbial colonization and CHG concentration. Other outcomes included HAIs, adverse reactions, and skin tolerability.
Results:
411 assessments occurred after baths with 2% cloth, and 425 assessments occurred after baths with 4% solution. Average microbial colonization was 691 (interquartile range 0, 30) colony-forming units per square centimeter (CFU/cm2) for patients bathed with 2% cloths, 1,627 (0, 265) CFUs/cm2 for 4% solution, and 8,519 (10, 1130) CFUs/cm2 for patients who did not have a CHG bath (P < .001). Average CHG skin concentration (parts per million) was 1300.4 (100, 2000) for 2% cloths, 307.2 (30, 200) for 4% solution, and 32.8 (0, 20) for patients without a recorded CHG bath. Both CHG bathing methods were well tolerated. Although underpowered, no difference in HAI was noted between groups.
Conclusions:
Both CHG bathing methods resulted in a significant decrease in microbial skin colonization, with greater CHG skin concentrations and fewer organisms associated with the 2% CHG cloths.
The stratigraphic record of the Early Holocene in the Nebraska Sand Hills suggests dry climatic conditions and periods of sustained aeolian activity, which resulted in several well-documented instances of sand dunes blocking river drainages in the western Sand Hills. Here, we present evidence that drainage blockage by migrating sand dunes also occurred in the central Sand Hills, where precipitation is higher and dune morphology differs. The South Fork Dismal River valley contains a sequence of aeolian, alluvial, and lacustrine sediments that record a gradual rise of the local water table following a sand dune blockage of the river valley around 11,000 years ago. After the initial development of a wetland, a lake formed and persisted for at least 2000 years. Increased groundwater discharge due to a warm, moist climate in the region after 6500 years ago likely caused the breaching of the dune dam and eventually resulted in the decline of the local water table. Through a careful examination of the intricate relationships between groundwater, surface water, and sand movement in a dune field setting, we discuss the hydrologic system's complex response to climate change. We use diatoms to reconstruct the lacustrine environment and optically stimulated luminescence and radiocarbon dating to provide chronological control, based on a careful evaluation of the strengths and limitations of each method in varied depositional environments.
Antarctica is populated by a diverse array of terrestrial fauna that have successfully adapted to its extreme environmental conditions. The origins and diversity of the taxa have been of continuous interest to ecologists since their discovery. Early theory considered contemporary populations as descendants of recent arrivals; however, mounting molecular evidence points to firmly established indigenous taxa far earlier than the Last Glacial Maximum, thus indicating more ancient origins. Here we present insights into Antarctica's terrestrial invertebrates by synthesizing available phylogeographic studies. Molecular dating supports ancient origins for most indigenous taxa, including Acari (up to 100 million years ago; Ma), Collembola (21–11 Ma), Nematoda (~30 Ma), Tardigrada (> 1 Ma) and Chironomidae (> 49 Ma), while Rotifera appear to be more recent colonizers (~130 Ka). Subsequent population bottlenecks and rapid speciation have occurred with limited gene transfer between Continental and Maritime Antarctica, while repeated wind- or water-borne dispersal and colonization of contiguous regions during interglacial periods shaped current distributions. Greater knowledge of Antarctica's fauna will focus conservation efforts to ensure their persistence.
In 2015, a continuous 15.4 m snow/firn core was recovered from central South Georgia Island at ∼850 m a.s.l. All firn core samples were analyzed for major (Al, Ca, Mg, Na, K, Ti and Fe) and trace element concentrations (Sr, Cd, Cs, Ba, La, Ce, Pr, Pb, Bi, U, As, Li, S, V, Cr, Mn, Co, Cu and Zn) and stable water isotopes. The chemical and isotopic signal is well preserved in the top 6.2 m of the core. Below this depth, down to the bottom of the core, signal dampening is observed in the majority of the elemental species making it difficult to distinguish a seasonal signal. Thirteen elements (As, Bi, Ca, Cd, Cu, K, Li, Mg, Na, Pb, S, Sr and Zn) have crustal enrichment factor values higher than 10 suggesting sources in addition to those found naturally in the crust. While this study shows that 850 m a.s.l. is not high enough to preserve a record including recent years, higher-elevation (>1250 m a.s.l.) glaciers may be likely candidates for ice core drilling to recover better-preserved, continuous, recent to past glaciochemical records.
Recent theories have implicated inflammatory biology in the development of psychopathology and maladaptive behaviors in adolescence, including suicidal thoughts and behaviors (STB). Examining specific biological markers related to inflammation is thus warranted to better understand risk for STB in adolescents, for whom suicide is a leading cause of death.
Method:
Participants were 211 adolescent females (ages 9–14 years; Mage = 11.8 years, SD = 1.8 years) at increased risk for STB. This study examined the prospective association between basal levels of inflammatory gene expression (average of 15 proinflammatory mRNA transcripts) and subsequent risk for suicidal ideation and suicidal behavior over a 12-month follow-up period.
Results:
Controlling for past levels of STB, greater proinflammatory gene expression was associated with prospective risk for STB in these youth. Similar effects were observed for CD14 mRNA level, a marker of monocyte abundance within the blood sample. Sensitivity analyses controlling for other relevant covariates, including history of trauma, depressive symptoms, and STB prior to data collection, yielded similar patterns of results.
Conclusions:
Upregulated inflammatory signaling in the immune system is prospectively associated with STB among at-risk adolescent females, even after controlling for history of trauma, depressive symptoms, and STB prior to data collection. Additional research is needed to identify the sources of inflammatory up-regulation in adolescents (e.g., stress psychobiology, physiological development, microbial exposures) and strategies for mitigating such effects to reduce STB.
The study deals with the problem of determining true dimensionality of data-with-error scaled by Kruskal's multidimensional scaling technique. Artificial data was constructed for 6, 8, 12, 16, and 30 point configurations of 1, 2, or 3 true dimensions by adding varying amounts of error to the true distances. Results show how stress is affected by error, number of points, and number of dimensions, and indicate that stress and the “elbow” criterion are inadequate for purposes of identifying true dimensionality when there is error in the data. The Wagenaar-Padmos procedure for identifying true dimensionality and error level is discussed. A simplified technique, involving a measure called Constraint, is suggested.
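To make the stress measure in this abstract concrete, the sketch below computes Kruskal's stress-1 for a vector of fitted configuration distances against their disparities (the monotone-regressed target distances). This is only the final goodness-of-fit formula, not the full iterative scaling procedure, and the input values are illustrative.

```python
import numpy as np

def stress1(d, dhat):
    """Kruskal's stress-1: sqrt( sum (d_ij - dhat_ij)^2 / sum d_ij^2 ),
    taken over all fitted inter-point distances d and disparities dhat."""
    d = np.asarray(d, dtype=float)
    dhat = np.asarray(dhat, dtype=float)
    return float(np.sqrt(((d - dhat) ** 2).sum() / (d ** 2).sum()))

# A perfect monotone fit gives zero stress; error in the data raises it,
# which is why stress alone cannot separate error from extra dimensions.
print(stress1([1.0, 2.0, 2.5], [1.0, 2.0, 2.5]))  # 0.0
```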
Human abilities in perceptual domains have conventionally been described with reference to a threshold that may be defined as the maximum amount of stimulation which leads to baseline performance. Traditional psychometric links, such as the probit, logit, and t, are incompatible with a threshold as there are no true scores corresponding to baseline performance. We introduce a truncated probit link for modeling thresholds and develop a two-parameter IRT model based on this link. The model is Bayesian and analysis is performed with MCMC sampling. Through simulation, we show that the model provides for accurate measurement of performance with thresholds. The model is applied to a digit-classification experiment in which digits are briefly flashed and then subsequently masked. Using parameter estimates from the model, individuals’ thresholds for flashed-digit discrimination is estimated.
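The key idea of the truncated probit link can be sketched in a few lines. The following is an illustrative parameterisation, not the authors' exact Bayesian MCMC model: the linear predictor of a two-parameter item response function is truncated at zero, so every ability at or below the threshold maps to the same baseline performance of Phi(0) = 0.5, which an untruncated probit link cannot produce.

```python
from math import erf, sqrt

def Phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def p_correct(theta, a, b):
    """Response probability under a truncated probit link (sketch).
    theta: ability; a: discrimination; b: threshold (both hypothetical names).
    The predictor a * (theta - b) is truncated at 0, so all abilities at or
    below b yield the baseline probability Phi(0) = 0.5."""
    eta = max(0.0, a * (theta - b))
    return Phi(eta)
```

Under an ordinary probit link, lowering theta below b would keep pushing the probability below 0.5; the truncation instead creates a flat region of true baseline scores, which is what makes a threshold estimable.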
The role of initial specimen diversion devices (ISDDs) in preventing contamination of central venous catheter (CVC) blood cultures is undefined. A model to simulate CVC colonization and contamination compared standard cultures with ISDD technique. ISDD detected 100% of colonized CVCs while decreasing false-positive cultures from 36% to 16%.
This agenda-setting volume disrupts conventional notions of time through a robust examination of the relations between temporality, infrastructure and urban society. With global coverage of diverse cities and regions from Berlin to Jayapura, this book re-evaluates the temporal complexities that shape our infrastructured worlds.
Intake of high quantities of dietary proteins sourced from dairy, meat or plants can affect body weight and metabolic health in humans. To improve our understanding of how this may be achieved, we reviewed the data on the availability of nutrients and metabolites in the faeces, circulation and urine. All protein sources (≥20% by energy) increased faecal levels of branched-chain fatty acids and ammonia and decreased the levels of butyrate. Some metabolites responded to dairy and meat proteins (branched-chain amino acids) as well as dairy and plant proteins (p-cresol), which were increased in faecal matter. Specific to dairy protein intake, the faecal levels of acetate, indole and phenol were increased, whereas plant protein intake specifically increased the levels of kynurenine and tyramine. Meat protein intake increased the faecal levels of methionine, cysteine and alanine and decreased the levels of propionate and acetate. The metabolite profile in the faecal matter following dairy protein intake mirrored availability in the circulation or urine. These findings improve our understanding of the contrasting gut versus systemic effects of different dietary proteins, which are known to have different physiological effects. In this regard, we provide directions for determining the mechanisms underlying these effects.