The prenatal and early-life periods represent a crucial neurodevelopmental window during which disruptions to the intestinal microbiota and the developing brain may have adverse impacts. As antibiotics affect the human intestinal microbiome, early-life antibiotic exposure may plausibly be associated with later-life psychiatric or neurocognitive outcomes.
Aims
To explore the association between early-life (in utero and early childhood (age 0–2 years)) antibiotic exposure and the subsequent risk of psychiatric and neurocognitive outcomes.
Method
A search was conducted using the Medline, PsycINFO and Excerpta Medica databases on 20 November 2023. Risk of bias was assessed using the Newcastle–Ottawa Scale, and certainty was assessed using the Grading of Recommendations, Assessment, Development and Evaluation (GRADE) framework.
Results
Thirty studies were included (n = 7 047 853 participants). Associations were observed between in utero antibiotic exposure and later development of autism spectrum disorder (ASD) (odds ratio 1.09, 95% CI 1.02–1.16) and attention-deficit hyperactivity disorder (ADHD) (odds ratio 1.19, 95% CI 1.11–1.27), and between early-childhood exposure and later development of ASD (odds ratio 1.19, 95% CI 1.01–1.40), ADHD (odds ratio 1.33, 95% CI 1.20–1.48) and major depressive disorder (MDD) (odds ratio 1.29, 95% CI 1.04–1.60). However, studies that used sibling control groups showed no significant association between early-life exposure and ASD or ADHD. No MDD studies used sibling controls. Under the GRADE assessment, all but one of the meta-analyses were rated very low certainty, largely owing to methodological and statistical heterogeneity.
Conclusions
While there was weak evidence for associations between antibiotic use in early life and later neurodevelopmental outcomes, these associations were attenuated in sibling-controlled subgroup analyses. They may therefore be explained by genetic and familial confounding, and studies that do not utilise sibling control groups must be interpreted with caution. PROSPERO ID: CRD42022304128
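As a hedged illustration of how pooled odds ratios like those in the Results above are typically obtained, the sketch below implements DerSimonian-Laird random-effects pooling on hypothetical study-level odds ratios; the review's actual software and model specification are not given here, so this is a generic method sketch, not the authors' analysis.

```python
# Illustrative random-effects pooling of odds ratios (DerSimonian-Laird).
# The study values below are hypothetical, not those analysed in the review.
import math

# (odds_ratio, ci_lower, ci_upper) for three hypothetical studies
studies = [(1.05, 0.95, 1.16), (1.12, 1.01, 1.24), (1.20, 1.03, 1.40)]

# Work on the log-odds scale; recover SEs from the 95% CI width
log_or = [math.log(or_) for or_, lo, hi in studies]
se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for _, lo, hi in studies]
w = [1 / s**2 for s in se]  # fixed-effect (inverse-variance) weights

# DerSimonian-Laird estimate of between-study variance tau^2
fixed = sum(wi * yi for wi, yi in zip(w, log_or)) / sum(w)
q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_or))
df = len(studies) - 1
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights and pooled estimate
w_re = [1 / (s**2 + tau2) for s in se]
pooled = sum(wi * yi for wi, yi in zip(w_re, log_or)) / sum(w_re)
se_pooled = math.sqrt(1 / sum(w_re))
print(f"pooled OR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96 * se_pooled):.2f}-"
      f"{math.exp(pooled + 1.96 * se_pooled):.2f})")
```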
The incidence of Kawasaki Disease peaks in the winter months, with a trough in late summer/early fall. Environmental and exposure factors have been associated with this time-varying incidence, and these factors were altered during the COVID-19 pandemic. The study was performed through the International Kawasaki Disease Registry. Data were obtained from patients diagnosed with acute Kawasaki Disease and Multisystem Inflammatory Syndrome in Children (MIS-C); guideline case definitions were used to confirm site diagnoses. Enrollment was from January 2020 to July 2023. The number of patients was plotted over time, and patients/month were tabulated for the anticipated peak Kawasaki Disease season (December–April) and non-peak season (May–November). Data were available for 1975 patients from 11 large North American sites with verified complete data and uninterrupted site reporting. Diagnostic criteria were met for 531 Kawasaki Disease and 907 MIS-C patients. For MIS-C, there were peaks in January of 2021 and 2022. For Kawasaki Disease, 2020 began (January–March) with a seasonal peak (peak 26, mean 21 cases/month), followed by a fall in the number of cases/month (mean 11). After the onset of the pandemic (April 2020), there was no clear seasonal Kawasaki Disease variation (December–April mean 12 cases/month; May–November mean 10 cases/month). During the pandemic, the incidence of Kawasaki Disease decreased and the usual seasonality was abolished. This may reflect the impact of pandemic public health measures in altering the environmental and exposure aetiologic factors that contribute to the incidence of Kawasaki Disease.
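A minimal sketch of the tabulation described above, assuming hypothetical diagnosis dates: count cases per month, then average over the peak (December–April) and non-peak (May–November) seasons. The registry's actual pipeline is not described in the abstract, so this only illustrates the bookkeeping.

```python
# Count cases per calendar month and compare seasonal means.
from collections import Counter
from datetime import date

# Hypothetical placeholder diagnosis dates, one per patient
diagnosis_dates = [date(2020, 1, 15), date(2020, 2, 3), date(2021, 6, 21)]

cases_per_month = Counter((d.year, d.month) for d in diagnosis_dates)

peak_months = {12, 1, 2, 3, 4}  # December through April
peak = [n for (y, m), n in cases_per_month.items() if m in peak_months]
nonpeak = [n for (y, m), n in cases_per_month.items() if m not in peak_months]

print("peak mean cases/month:", sum(peak) / len(peak))
print("non-peak mean cases/month:", sum(nonpeak) / len(nonpeak))
```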
Judges are not the first political officials who come to mind when one considers the role of social media in modern politics. Following the adoption of Twitter by some prominent judicial personalities, however, a growing number of state high court judges have joined the platform and established more public personas. Judges use Twitter in substantively different ways than traditional elected officials (Curry and Fix 2019); however, little is understood about how the use of such social media platforms affects broader judicial networks. Recognizing that judges, like typical social media users, may aspire to expand their networks to build and appeal to broader audiences, we contend that active participation in the judicial Twitterverse could yield personal and professional advantages. Here, we address a currently unexplored question: to what extent have judges formed a distinctive “judicial network” on Twitter, and what discernible patterns are present in these networks? Leveraging the unique structure of social media, we collect comprehensive network data on judges using Twitter and analyze which institutional and social factors confer greater power within the judicial network. We find that early adoption, electoral concerns, and connective links between judges all affect the strength of the judicial network, highlighting the complex motivations driving judicial Twitter engagement, the significance of network building in judges’ social media strategies, and its potential impact on career advancement.
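To make "power within the judicial network" concrete, here is a small sketch (not the authors' measure, which the abstract does not specify) that scores accounts in a hypothetical follow graph with PageRank centrality using networkx.

```python
# Score influence in a directed follow graph with PageRank centrality.
# Edges are hypothetical; a real analysis would ingest collected Twitter data.
import networkx as nx

g = nx.DiGraph()
# edge (a, b) means judge a follows judge b
g.add_edges_from([
    ("judge_a", "judge_b"),
    ("judge_c", "judge_b"),
    ("judge_b", "judge_a"),
])

# PageRank rewards being followed by accounts that are themselves well followed
influence = nx.pagerank(g)
for judge, score in sorted(influence.items(), key=lambda kv: -kv[1]):
    print(f"{judge}: {score:.3f}")
```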
To maximize its value, the design, development and implementation of structural health monitoring (SHM) should focus on its role in facilitating decision support. In this position paper, we offer perspectives on the synergy between SHM and decision-making. We propose a classification of SHM use cases aligning with various dimensions that are closely linked to the respective decision contexts. The types of decisions that have to be supported by the SHM system within these settings are discussed along with the corresponding challenges. We provide an overview of different classes of models that are required for integrating SHM in the decision-making process to support the operation and maintenance of structures and infrastructure systems. Fundamental decision-theoretic principles and state-of-the-art methods for optimizing maintenance and operational decision-making under uncertainty are briefly discussed. Finally, we offer a viewpoint on the appropriate course of action for quantifying, validating, and maximizing the added value generated by SHM. This work aspires to synthesize the different perspectives of the SHM, Prognostic Health Management, and reliability communities, and provide directions to researchers and practitioners working towards more pervasive monitoring-based decision-support.
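The decision-theoretic principle referenced above can be made concrete with a toy example: given an SHM-informed probability of a damage state, choose the maintenance action that minimizes expected cost. All states, actions, and cost figures below are illustrative assumptions, not values from the paper.

```python
# Minimal expected-cost maintenance decision under uncertainty.

p_damage = 0.15  # posterior probability of damage, e.g. inferred from SHM data

# cost[action][state]: monetary consequence of each action in each state
cost = {
    "do_nothing": {"intact": 0.0,  "damaged": 100.0},  # carries failure risk
    "inspect":    {"intact": 5.0,  "damaged": 40.0},
    "repair":     {"intact": 20.0, "damaged": 25.0},
}

def expected_cost(action: str) -> float:
    c = cost[action]
    return (1 - p_damage) * c["intact"] + p_damage * c["damaged"]

best = min(cost, key=expected_cost)
print({a: round(expected_cost(a), 2) for a in cost}, "->", best)
```

The value of the SHM system in this framing is the reduction in expected cost achieved by deciding with the monitoring-informed probability rather than the prior, which is the basis of value-of-information analyses.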
Floatation-REST (restricted environmental stimulation therapy) has shown promising potential as a therapeutic intervention in psychiatric conditions such as anxiety and anorexia nervosa. We speculate that the sensory deprivation might act as a kind of interoceptive training. Within our lab, interoceptive trait prediction error has been used to predict states of anxiety in autistic adults. There is also emerging research conceptualising interoceptive mismatches as potentially playing a role in fatigue. Our aim was to run a feasibility study assessing the tolerability of Floatation-REST for participants with disabling fatigue. We also aimed to establish the feasibility of gathering data on mechanistic measures, such as heart rate variability (HRV) and interoception, during floatation.
Methods
Participants were recruited via online advertisements and were screened to confirm a score of at least 36 on the Fatigue Severity Scale (FSS). Pertinent medication changes and previous float experience within the last 6 weeks were amongst the exclusion criteria. Baseline measures included the Modified Fatigue Impact Scale (MFIS), Body Perception Questionnaire, a hypermobility questionnaire and the Tellegen Absorption Scale. Participants completed four 90-minute sessions of Floatation-REST across a 2–6 week period, with 1 week of ecological momentary sampling (EMS) before and after. Immediate pre- and post-float measures included tests of interoceptive sensibility, accuracy and awareness. HRV was measured during floatation. Change in energy was measured by retrospective subjective assessment, changes in validated fatigue scales and EMS.
Results
Fifteen participants were recruited to the study; 13 started the float intervention and 11 completed all four sessions. Baseline MFIS scores (median = 67.5; range = 55–77) indicated a high degree of participant fatigue. No dropout was due to poor tolerability, and most adverse events were mild, expected and related to the pre/post float testing. HRV data were successfully captured throughout all sessions. Participant surveys described improvements in energy levels, sleep and relaxation, and 73% “strongly agreed” that the overall effect was positive. Furthermore, statistically and clinically significant reductions were noted in mean FSS scores (56.9 to 52.6; p = 0.044) and MFIS scores (67.0 to 56.4; p = 0.003). Detailed energy assessment was obtained by EMS, with 37 to 86 data points per participant.
Conclusion
Floatation-REST appears to be a feasible intervention for people with severe fatigue. EMS, HRV data, interoceptive data and other measures were reliably recorded. Reported subjective benefits were supported by an improvement in objective fatigue scores, though the lack of a control group makes these improvements speculative at present.
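For readers unfamiliar with the pre/post comparisons reported in the Results above, the sketch below runs a paired t-test and a Wilcoxon signed-rank alternative on hypothetical FSS-like scores; these are not the study's data, and the study's exact test is not stated in the abstract.

```python
# Paired pre/post comparison of fatigue scores on hypothetical data.
from scipy import stats

fss_pre = [58, 60, 55, 62, 57, 61, 54, 59, 63, 56, 60]
fss_post = [52, 55, 50, 58, 54, 57, 49, 55, 60, 51, 56]

# Parametric paired t-test
t_stat, p_value = stats.ttest_rel(fss_pre, fss_post)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# Non-parametric alternative, common with small samples
w_stat, p_wilcoxon = stats.wilcoxon(fss_pre, fss_post)
print(f"Wilcoxon W = {w_stat:.1f}, p = {p_wilcoxon:.3f}")
```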
The long-standing assumption that aboveground plant litter inputs have a substantial influence on soil organic carbon storage (SOC) and dynamics has been challenged by a new paradigm for SOC formation and persistence. We tested the importance of plant litter chemistry on SOC storage, distribution, composition, and age by comparing two highly contrasting ecosystems: an old-growth coast redwood (Sequoia sempervirens) forest, with highly aromatic litter, and an adjacent coastal prairie, with more easily decomposed litter. We hypothesized that if plant litter chemistry was the primary driver, redwood would store more and older SOC that was less microbially processed than prairie. Total soil carbon stocks to 110 cm depth were higher in prairie (35 kg C m−2) than redwood (28 kg C m−2). Radiocarbon values indicated shorter SOC residence times in redwood than prairie throughout the profile. Higher amounts of pyrogenic carbon and a higher degree of microbial processing of SOC appear to be instrumental for soil carbon storage and persistence in prairie, while differences in fine-root carbon inputs likely contribute to younger SOC in redwood. We conclude that at these sites fire residues, root inputs, and soil properties influence soil carbon dynamics to a greater degree than the properties of aboveground litter.
Research has shown that framing messages in terms of benefits or detriments can substantially influence intended behavior. For prevention behaviors, positively framed messages have been found to elicit stronger behavioral intentions than negatively framed messages. Research also indicates that certain contextual features contribute to the persuasiveness of a message. In the present research we test how message framing, contextually presented affect and the number of arguments interact to determine the persuasiveness of a health-related message. Consistent with our hypothesis, we found that, in our prevention-focused task, increasing the number of arguments increased behavioral intentions (BI) for positively framed messages when subjects were cued, via negative affect, to be attentive to the message. This produced a significant framing effect for messages with the maximum number of arguments and a negative background picture. An account of contextual influence in persuasive health messages is discussed.
Objective:
To examine the impact of SARS-CoV-2 infection on the central line-associated bloodstream infection (CLABSI) rate and characterize the patients who developed a CLABSI. We also examined the impact of a CLABSI-reduction quality-improvement project in patients with and without COVID-19.
Design:
Retrospective cohort analysis.
Setting:
Academic 889-bed tertiary-care teaching hospital in urban Los Angeles.
Patients or participants:
Inpatients 18 years and older with CLABSI as defined by the National Healthcare Safety Network (NHSN).
Intervention(s):
CLABSI rate and patient characteristics were analyzed for 2 cohorts during the pandemic era (March 2020–August 2021): COVID-19 CLABSI patients and non–COVID-19 CLABSI patients, based on diagnosis of COVID-19 during admission. Secondary analyses were the non–COVID-19 CLABSI rate versus a historical control period (2019), the ICU CLABSI rate in COVID-19 versus non–COVID-19 patients, and CLABSI rates before and after a quality-improvement initiative.
Results:
The rate of COVID-19 CLABSI was significantly higher than the non–COVID-19 CLABSI rate. We did not detect a difference between the non–COVID-19 CLABSI rate and the historical control. COVID-19 CLABSIs occurred predominantly in the ICU, and the ICU COVID-19 CLABSI rate was significantly higher than the ICU non–COVID-19 CLABSI rate. A hospital-wide quality-improvement initiative reduced the rate of non–COVID-19 CLABSI but not COVID-19 CLABSI.
Conclusions:
Patients hospitalized for COVID-19 have a significantly higher CLABSI rate, particularly in the ICU setting. Reasons for this increase are likely multifactorial, including both patient-specific and process-related issues. Focused quality-improvement efforts were effective in reducing CLABSI rates in non–COVID-19 patients but were less effective in COVID-19 patients.
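CLABSI rates are conventionally expressed per 1,000 central-line days. As a hedged sketch of the kind of rate comparison reported above, the code below computes such rates for two hypothetical cohorts and tests the rate ratio with a conditional binomial test; the counts are illustrative, not the study's data.

```python
# Compare infection rates per 1,000 central-line days between two cohorts.
from scipy import stats

covid_clabsi, covid_line_days = 25, 4_000
other_clabsi, other_line_days = 60, 30_000

rate_covid = 1000 * covid_clabsi / covid_line_days
rate_other = 1000 * other_clabsi / other_line_days
print(f"COVID-19 rate: {rate_covid:.2f}, non-COVID-19 rate: {rate_other:.2f}")

# Conditional test: given the total event count, are events split in
# proportion to the line-days at risk in each cohort?
p_expected = covid_line_days / (covid_line_days + other_line_days)
result = stats.binomtest(covid_clabsi, covid_clabsi + other_clabsi, p_expected)
print(f"rate ratio = {rate_covid / rate_other:.2f}, p = {result.pvalue:.4f}")
```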
Early in the COVID-19 pandemic, the World Health Organization stressed the importance of daily clinical assessments of infected patients, yet current approaches frequently consider cross-sectional timepoints, cumulative summary measures, or time-to-event analyses. Statistical methods are available that make use of the rich information content of longitudinal assessments. We demonstrate the use of a multistate transition model to assess the dynamic nature of COVID-19-associated critical illness using daily evaluations of COVID-19 patients from 9 academic hospitals. We describe the accessibility and utility of methods that consider the clinical trajectory of critically ill COVID-19 patients.
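As a minimal illustration of a multistate transition model, the sketch below estimates a daily transition-probability matrix from hypothetical per-patient state sequences. The states and trajectories are invented, and the published analyses typically use richer continuous-time or covariate-adjusted formulations; this only shows the core idea of pooling daily assessments into transition probabilities.

```python
# Maximum-likelihood daily transition probabilities from state sequences.
from collections import defaultdict

STATES = ["ward", "icu", "discharged", "dead"]

# One state per day per patient; absorbing states end each sequence
trajectories = [
    ["ward", "ward", "icu", "icu", "ward", "discharged"],
    ["ward", "icu", "icu", "dead"],
    ["ward", "ward", "discharged"],
]

# Count observed day-to-day transitions
counts = defaultdict(lambda: defaultdict(int))
for traj in trajectories:
    for a, b in zip(traj, traj[1:]):
        counts[a][b] += 1

# Row-normalize counts into transition probabilities
for a in STATES:
    total = sum(counts[a].values())
    if total:
        row = {b: counts[a][b] / total for b in STATES if counts[a][b]}
        print(a, "->", row)
```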
We use attribution theory to show that firms that make more internal attributions to positive performance outcomes engage slack resources more freely for corporate entrepreneurship (CE) than firms that make fewer of such attributions. In addition, we show that the way in which companies make external attributions to performance outcomes moderates this relationship. To examine this empirically, we explore how top management teams discuss the factors that contribute to firm performance. Specifically, we look at attributions provided in the Management's Discussion and Analysis section of the annual reports of 144 pharmaceutical firms over a 2-year period. In line with our predictions, we find that greater internal attribution to positive performance outcomes leads to increased use of slack resources for CE. Furthermore, we find that this effect is stronger when firms make more external attributions to negative performance outcomes than positive performance outcomes.
Given the relatively small industry scale of cow-calf operations in New York relative to other regions of the country, little is known about differences in determinant values for feeder cattle. Using auction prices and quality characteristics over 7 years, differences in market, lot, and quality parameters suggest opportunities for improved marketing performance. A delta profit model is constructed to inform timing of marketing decisions for producers. The results indicate a relatively high potential for producers to increase farm returns by delaying sales of lighter-weight feeder cattle from the fall to spring auction months, given sufficient rates of gain and reasonable overwintering costs.
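A delta profit comparison of this kind can be sketched as: revenue from a spring sale at a heavier weight, minus fall-sale revenue, minus overwintering costs. The prices, gain, and cost figures below are illustrative assumptions, not the paper's estimates.

```python
# Sell in fall, or overwinter and sell heavier in spring? All values assumed.

fall_weight_cwt = 5.0            # 500 lb calf = 5 cwt
fall_price = 165.0               # $/cwt at fall auction
spring_price = 155.0             # $/cwt in spring (heavier cattle price lower)
daily_gain_lb = 1.5              # average daily gain over winter
days_overwinter = 150
overwinter_cost_per_day = 1.10   # feed, yardage, interest ($/head/day)

spring_weight_cwt = fall_weight_cwt + daily_gain_lb * days_overwinter / 100
revenue_fall = fall_price * fall_weight_cwt
revenue_spring = spring_price * spring_weight_cwt
delta_profit = (revenue_spring - revenue_fall
                - overwinter_cost_per_day * days_overwinter)
print(f"delta profit from delaying sale: ${delta_profit:.2f}/head")
```

Under these assumed numbers the delayed sale nets roughly $134/head more, which is the sign of effect the abstract reports; the decision flips if rates of gain fall or overwintering costs rise.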
The severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic of 2020–2021 created unprecedented challenges for clinicians in critical care transport (CCT). CCT services had to rapidly adjust their clinical approaches to evolving patient demographics, a preponderance of respiratory failure, and new transport utilization strategies. Organizations had to develop and implement new protocols and guidelines in rapid succession, often without the education and training that would have been involved pre-coronavirus disease 2019 (COVID-19). These changes were complicated by the need to protect crew members as well as to optimize patient care. Clinical initiatives included developing an awake proning transport protocol and a protocol to transport intubated proned patients. One service developed a protocol for helmet ventilation to minimize aerosolization risks for patients on noninvasive positive pressure ventilation (NIPPV). While these clinical protocols were developed specifically for COVID-19, this growth in practice will enhance the care of patients with other causes of respiratory failure. Additionally, these processes will apply to future respiratory epidemics and pandemics.
Interviewing patients is one of the most rewarding aspects of clinical psychiatry. It offers an opportunity to get to know someone, to find clues to diagnosis, and to relieve suffering. The psychiatric interview thus functions as an alliance-building process, diagnostic procedure, and therapeutic intervention. While this may sound complex, the interview process can be simplified by learning to approach it with the proper attitude. This can be considered analogous to helping a young musician learn how to have proper posture at the piano or to hold a violin and bow correctly. Without a good feel for the instrument, and without the appropriate perspective for learning what the music is about, the simple drilling of scales and fingerings will be misguided. Similarly, in the psychiatric interview, one must have a proper attitude toward the patient to be of the most help. The key qualities of this approach are curiosity, respect, and caring. If you notice obstacles to feeling interested in or caring about the patient, do not despair – such attitudes can be cultivated (see the section on empathy and compassion later in this chapter).
The purpose of this study was to compare statistical knowledge of health science faculty across accredited schools of dentistry, medicine, nursing, pharmacy, and public health.
Methods:
A probability sample of schools was selected, and all faculty at each selected school were invited to participate in an online statistical knowledge assessment that covered fundamental topics including randomization, study design, statistical power, confidence intervals, multiple testing, standard error, regression outcome, and odds ratio.
Results:
A total of 708 faculty from 102 schools participated, for an overall response rate of 6.5%. Most (94.2%) faculty reported reading the peer-reviewed health-related literature. Across all questions and disciplines, respondents answered 66.2% of questions correctly. Public health faculty had the highest performance (80.7%) and dentistry the lowest (53.3%).
Conclusions:
Knowledge of statistics is essential for critically evaluating evidence and understanding the health literature. These study results identify a gap in knowledge by educators tasked with training the next generation of health science professionals. Recommendations for addressing this gap are provided.
Approximately one in ten adults under the age of 65 in the USA has a mobility-impairing disability. People with mobility impairment generally have poorer dietary habits, contributing to obesity and related negative health outcomes. This article presents the psychometric properties of the Food Environment Assessment Survey Tool (FEAST), an instrument that measures barriers to accessing healthy food from the perspective of people with mobility impairment (PMI).
Design:
The current study presents cross-sectional data from two sequential independent surveys.
Setting:
Surveys were administered online to a national sample of PMI.
Participants:
Participants represented PMI living throughout the USA. The pilot FEAST survey involved 681 participants and was used to shape the final instrument; 25 % completed a retest survey. After following empirically and theoretically guided item reduction strategies, the final FEAST instrument was administered to a separate sample of 304 PMI.
Results:
The final twenty-seven-item FEAST instrument includes items measuring Neighbourhood Environment, Home Environment, Personal Control and Access to Support (Having Help, Food Delivery Services, Parking/Transportation). The final four scales had acceptable intra-class correlations, indicating that the scales could be used as reliable measures of the hypothesised constructs in future studies.
Conclusions:
The FEAST instrument is the first of its kind developed to assess the food environment from the perspective of PMI themselves. Future studies would benefit from using this measure in research and practice to help guide the development of policy aimed at improving access to healthy food and promoting healthy eating in community-dwelling PMI.
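For readers unfamiliar with intra-class correlations as a test-retest reliability measure like the one reported in the Results above, the sketch below computes a one-way ICC(1,1) by hand on hypothetical scale scores; the FEAST analysis may have used a different ICC form.

```python
# One-way intraclass correlation, ICC(1,1), for test-retest scores.
import numpy as np

# rows = participants, columns = test and retest administrations (hypothetical)
scores = np.array([
    [3.2, 3.4],
    [2.1, 2.0],
    [4.5, 4.1],
    [3.8, 3.9],
    [1.9, 2.2],
])

n, k = scores.shape
grand = scores.mean()
subj_means = scores.mean(axis=1)

# One-way ANOVA decomposition: between-subject vs within-subject mean squares
ms_between = k * np.sum((subj_means - grand) ** 2) / (n - 1)
ms_within = np.sum((scores - subj_means[:, None]) ** 2) / (n * (k - 1))

icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
print(f"ICC(1,1) = {icc:.2f}")
```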
The global community needs to be aware of the potential psychosocial consequences that may be experienced by health care workers who are actively managing patients with coronavirus disease (COVID-19). These health care workers are at increased risk for experiencing mood and trauma-related disorders, including posttraumatic stress disorder (PTSD). In this concept article, strategies are recommended for individual health care workers and hospital leadership to aid in mitigating the risk of PTSD, as well as to build resilience in light of a potential second surge of COVID-19.
Barium titanate (BTO) is a ferroelectric perovskite with potential in energy storage applications. Previous research suggests that the BTO dielectric constant increases as nanoparticle diameter decreases. This report describes an investigation of this relationship. Injection-molded nanocomposites of 5 vol% BTO nanoparticles in a low-density polyethylene matrix were fabricated and measured. Finite-element analysis was used to model nanocomposites at all BTO particle sizes, and the results were compared with experimental data. Both indicated a negligible relationship between BTO diameter and dielectric constant at 5 vol%. However, a path for fabricating and testing composites of 30 vol% and higher is presented here.
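A particle-size-independent baseline for composite permittivity, useful as a sanity check alongside finite-element results, is the classical Maxwell Garnett mixing rule for spherical inclusions; note that this rule contains no size dependence at all, consistent with the negligible size effect reported at 5 vol%. The permittivity values below are rough literature-style assumptions, not measurements from this work, and the rule is least reliable at high loadings such as 30 vol%.

```python
# Maxwell Garnett effective-medium approximation for spherical inclusions.

def maxwell_garnett(eps_matrix: float, eps_particle: float, vol_frac: float) -> float:
    """Effective relative permittivity of dilute spherical inclusions."""
    num = eps_particle + 2 * eps_matrix + 2 * vol_frac * (eps_particle - eps_matrix)
    den = eps_particle + 2 * eps_matrix - vol_frac * (eps_particle - eps_matrix)
    return eps_matrix * num / den

eps_ldpe = 2.3     # low-density polyethylene, approximate value
eps_bto = 1500.0   # bulk BaTiO3; varies widely with grain size and phase

for f in (0.05, 0.30):
    print(f"{f:.0%} BTO: eps_eff ~ {maxwell_garnett(eps_ldpe, eps_bto, f):.1f}")
```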
Downy brome, feral rye, and jointed goatgrass are problematic winter annual grasses in central Great Plains winter wheat production. Integrated control strategies are needed to manage winter annual grasses and reduce selection pressure exerted on these weed populations by the limited herbicide options currently available. Harvest weed-seed control (HWSC) methods aim to remove or destroy weed seeds, thereby reducing seed-bank enrichment at crop harvest. An added advantage is the potential to reduce herbicide-resistant weed seeds that are more likely to be present at harvest, thereby providing a nonchemical resistance-management strategy. Our objective was to assess the potential for HWSC of winter annual grass weeds in winter wheat by measuring seed retention at harvest and destruction percentage in an impact mill. During 2015 and 2016, 40 wheat fields in eastern Colorado were sampled. Seed retention was quantified and compared per weed species by counting seed retained above the harvested fraction of the wheat upper canopy (15 cm and above), seed retained below 15 cm, and shattered seed on the soil surface at wheat harvest. A stand-mounted impact mill device was used to determine the percent seed destruction of grass weed species in processed wheat chaff. Averaged across both years, seed retention (±SE) was 75% ± 2.9%, 90% ± 1.7%, and 76% ± 4.3% for downy brome, feral rye, and jointed goatgrass, respectively. Seed retention was most variable for downy brome, because 59% of the samples had at least 75% seed retention, whereas the proportions for feral rye and jointed goatgrass samples with at least 75% seed retention were 93% and 70%, respectively. Weed seed destruction percentages were at least 98% for all three species. These results suggest HWSC could be implemented as an integrated strategy for winter annual grass management in central Great Plains winter wheat cropping systems.
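The two reported percentages combine multiplicatively: the fraction of a species' total seed production that an HWSC system can destroy is roughly seed retention times impact-mill destruction, assuming retained seed above 15 cm enters the harvested chaff. A back-of-envelope sketch using the averages reported above:

```python
# Compose reported seed retention with reported impact-mill destruction.
species = {
    # species: (seed retention at harvest, impact-mill destruction)
    "downy brome":       (0.75, 0.98),
    "feral rye":         (0.90, 0.98),
    "jointed goatgrass": (0.76, 0.98),
}

for name, (retention, destruction) in species.items():
    overall = retention * destruction
    print(f"{name}: ~{overall:.0%} of total seed production destroyed")
```

Under this simple composition, roughly 73-88% of total seed production would be removed from the seed bank, with feral rye the best HWSC target owing to its high retention.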