North Carolina growers have long struggled to control Italian ryegrass, and recent research has confirmed Italian ryegrass biotypes resistant to nicosulfuron, glyphosate, clethodim, and paraquat. Integrating alternative management strategies is crucial to effectively control such biotypes. The objectives of this study were to evaluate Italian ryegrass control with cover crops and fall-applied residual herbicides and to investigate cover crop injury from residual herbicides. This study was conducted during the fall/winter of 2021-22 at Salisbury, NC, and the fall/winter of 2021-22 and 2022-23 at Clayton, NC. The study was designed as a 3 × 5 split-plot in which the main plots consisted of three cover crop treatments (no cover, cereal rye at 80 kg ha−1, and crimson clover at 18 kg ha−1) and the subplots consisted of five residual herbicide treatments (S-metolachlor, flumioxazin, metribuzin, pyroxasulfone, and nontreated). In the 2021-22 season at Clayton, metribuzin injured cereal rye and crimson clover 65% and 55%, respectively; however, metribuzin injured both cover crops ≤6% in 2022-23. Flumioxazin caused unacceptable crimson clover injury: 50% and 38% in 2021-22 and 2022-23 at Clayton, respectively, and 40% at Salisbury. Without preemergence (PRE) herbicides, cereal rye controlled Italian ryegrass at 24 wk after planting 85% and 61% in 2021-22 and 2022-23 at Clayton, respectively, and 82% at Salisbury. In 2021-22, Italian ryegrass seed production was lowest in cereal rye treatments at both locations, except when the cover crop was treated with metribuzin. For example, at Salisbury, cereal rye plus metribuzin resulted in 39,324 seeds m−2, compared to ≤4,386 seeds m−2 from all other cereal rye treatments. In 2022-23, Italian ryegrass seed production in cereal rye was lower when either metribuzin or pyroxasulfone was used PRE (2,670 and 1,299 seeds m−2, respectively) than in cereal rye without herbicides (5,600 seeds m−2).
Two studies were conducted in 2022 and 2023 near Rocky Mount and Clayton, NC, to determine the optimal granular ammonium sulfate (AMS) rate and application timing for pyroxasulfone-coated AMS. In the rate study, AMS rates included 161, 214, 267, 321, 374, 428, and 481 kg ha−1, equivalent to 34, 45, 56, 67, 79, 90, and 101 kg N ha−1, respectively. All rates were coated with pyroxasulfone at 118 g ai ha−1 and topdressed onto 5- to 7-leaf cotton. In the timing study, pyroxasulfone (118 g ai ha−1) was coated on AMS and topdressed at 321 kg ha−1 (67 kg N ha−1) onto 5- to 7-leaf, 9- to 11-leaf, and first-bloom cotton. In both studies, weed control and cotton tolerance to pyroxasulfone-coated AMS were compared to pyroxasulfone applied POST and POST-directed. The check in both studies received non-herbicide-treated AMS (321 kg ha−1). Before treatment applications, all plots (including the check) were maintained weed-free with glyphosate and glufosinate. In both studies, pyroxasulfone applied POST was most injurious (8% to 16%), while pyroxasulfone-coated AMS resulted in ≤4% injury. Additionally, no differences in cotton lint yield were observed in either study. With the exception of the lowest AMS rate (161 kg ha−1; 79%), all AMS rates coated with pyroxasulfone controlled Palmer amaranth ≥83%, comparable to pyroxasulfone applied POST (92%) and POST-directed (89%). In the timing study, the application method did not affect Palmer amaranth control; however, applications made at the mid and late timings outperformed early applications. These results indicate that pyroxasulfone-coated AMS can control Palmer amaranth comparably to pyroxasulfone applied POST and POST-directed, with minimal risk of cotton injury. However, depending on application timing, an additional treatment may be warranted to achieve adequate late-season weed control.
An experiment was conducted in 2022 and 2023 near Rocky Mount and Clayton, NC, to evaluate residual herbicide-coated fertilizer for cotton tolerance and Palmer amaranth control. Treatments included acetochlor, atrazine, dimethenamid-P, diuron, flumioxazin, fluometuron, fluridone, fomesafen, linuron, metribuzin, pendimethalin, pyroxasulfone, pyroxasulfone + carfentrazone, S-metolachlor, and sulfentrazone. Each herbicide was individually coated on granular ammonium sulfate (AMS) and top-dressed at 321 kg ha−1 (67 kg N ha−1) onto 5- to 7-leaf cotton. The check plots received the equivalent rate of nonherbicide-treated AMS. Before top-dress, all plots (including the check) were treated with glyphosate and glufosinate to control previously emerged weeds. All herbicides except metribuzin resulted in transient cotton injury. Cotton response to metribuzin varied by year and location. In 2022, metribuzin caused 11% to 39% and 8% to 17% injury at the Clayton and Rocky Mount locations, respectively. In 2023, metribuzin caused 13% to 32% injury at Clayton and 73% to 84% injury at Rocky Mount. Pyroxasulfone (91%), pyroxasulfone + carfentrazone (89%), fomesafen (87%), fluridone (86%), flumioxazin (86%), and atrazine (85%) controlled Palmer amaranth ≥85%. Pendimethalin and fluometuron were the least effective treatments, resulting in 58% and 62% control, respectively. As anticipated, early-season metribuzin injury translated into yield loss; plots treated with metribuzin yielded 640 kg ha−1, comparable only to yields following linuron (790 kg ha−1). These findings suggest that, with the exception of metribuzin, residual herbicides coated onto AMS may be suitable and effective in cotton production, providing growers with additional modes of action for late-season control of multiple herbicide–resistant Palmer amaranth.
An important contributor to the decreased life expectancy of individuals with schizophrenia is sudden cardiac death. Arrhythmic disorders may play an important role herein, but the nature of the relationship between schizophrenia and arrhythmia is unclear.
Aims
To assess shared genetic liability and potential causal effects between schizophrenia and arrhythmic disorders and electrocardiogram (ECG) traits.
Method
We leveraged summary-level data of large-scale genome-wide association studies of schizophrenia (53 386 cases, 77 258 controls), arrhythmic disorders (atrial fibrillation, 55 114 cases, 482 295 controls; Brugada syndrome, 2820 cases, 10 001 controls) and ECG traits (heart rate (variability), PR interval, QT interval, JT interval and QRS duration, n = 46 952–293 051). We examined shared genetic liability by assessing global and local genetic correlations and conducting functional annotation. Bidirectional causal relations between schizophrenia and arrhythmic disorders and ECG traits were explored using Mendelian randomisation.
Results
There was no evidence for global genetic correlation, except between schizophrenia and Brugada syndrome (rg = 0.14, 95% CI 0.06–0.22, P = 4.0E−04). In contrast, strong positive and negative local correlations between schizophrenia and all cardiac traits were found across the genome. In the most strongly associated regions, genes related to immune and viral response mechanisms were overrepresented. Mendelian randomisation indicated that liability to schizophrenia causally increases Brugada syndrome risk (beta = 0.14, 95% CI 0.03–0.25, P = 0.009) and heart rate during activity (beta = 0.25, 95% CI 0.05–0.45, P = 0.015).
Conclusions
Despite little evidence for global genetic correlation, specific genomic regions and biological pathways emerged that are important for both schizophrenia and arrhythmia. The putative causal effect of liability to schizophrenia on Brugada syndrome warrants increased cardiac monitoring and early medical intervention in people with schizophrenia.
Using a large, geographically diverse, hospital-based database in the United States (Premier PINC AI Healthcare Database), we aimed to describe the proportion and characteristics of patients receiving phenotype-desirable antimicrobial therapy (PDAT) among those hospitalized with Enterobacterales bloodstream infections.
Methods:
Adult patients with an admission between January 1, 2017 and June 30, 2022 with ≥1 blood culture positive for Escherichia coli, Klebsiella oxytoca, Klebsiella pneumoniae, or Proteus mirabilis and receiving an empiric antibiotic therapy on blood culture collection (BCC) Days 0 or 1 were included. Receiving PDAT (defined as receipt of any antimicrobial categorized as “desirable” for the respective phenotype) on BCC Days 0−2 was defined as receiving early PDAT.
Results:
Among 35,880 eligible patients, the proportion receiving PDAT increased from 6.8% on BCC Day 0 to 22.8% by BCC Day 4. Patients who received PDAT (8,193, 22.8%) were more likely to be treated at large (500+ beds, 36% vs 31%), teaching (45% vs 39%), and urban (85% vs 82%) hospitals in the Northeast (22% vs 13%) compared to patients not receiving PDAT (all P < .01). Among patients receiving PDAT, 61.4% (n = 5,033) received it early; they had a lower mean comorbidity score (3.2 vs 3.6), were less likely to have severe or extreme severity of illness (71% vs 79%), and were less likely to have a pathogen susceptible to narrow-spectrum β-lactams (31% vs 71%) compared to patients in the delayed PDAT group (all P < .01).
Conclusions:
The proportion of patients receiving desirable therapy increased between BCC Days 0 and 4. Receipt of PDAT and of early PDAT was associated with hospital, clinical, and pathogen characteristics.
Zolpidem is a nonbenzodiazepine sedative-hypnotic that binds to GABA-A receptors at the same site as benzodiazepines and increases GABA effects in the central nervous system (Kovacic et al., Oxidative Medicine and Cellular Longevity 2009, 2(1), 52–57). The literature shows that behavioral changes, including amnesia, hallucinations, and other neurocognitive effects, are among its known side effects (Edinoff et al., Health Psychology Research 2021, 9(1), 24927). We present the case of Ms. A, a female in her sixties with a history of major depressive disorder with psychotic symptoms, who was brought into the hospital by EMS under police custody after stabbing her granddaughter with a knife. During the evaluation she was dissociating, with impaired memory of the circumstances of her presentation. Collateral information revealed that Ms. A had no history of violence or of psychoactive substance use. Her home psychiatric medications consisted of sertraline 100 mg, bupropion 150 mg, and zolpidem 5 mg.
Objectives
To better understand the potential risks of prescribing zolpidem in patients with insomnia.
Methods
An in-depth literature review about zolpidem was conducted. In addition, Ms. A was observed in the emergency department with a full medical workup including, but not limited to, urine drug screen, brain imaging, and lumbar puncture.
Results
Ms. A's medical workup was notable for a urinalysis revealing asymptomatic bacteriuria, which was treated empirically with cefdinir. Her medication regimen consisted of bupropion 150 mg and sertraline 100 mg, both daily. Zolpidem was discontinued and replaced with clonazepam 0.5 mg for insomnia. She was also started on olanzapine 5 mg in the morning and 10 mg in the evening. She received one dose of zolpidem in the hospital; within two days of its discontinuation, her mental status was noted to have improved. Upon literature review, previous reports have cited cases of patients on zolpidem physically acting out while sleeping in a parasomnia-like behavior, with no recollection upon awakening (Inagaki et al., Primary Care Companion to the Journal of Clinical Psychiatry 2010, 12(6)). There are also case reports of zolpidem-associated homicide (Paradis et al., The Primary Care Companion for CNS Disorders 2012, 14(4)).
Conclusions
One limitation of our study is that the patient's sudden behavioral change and altered mental status may be attributable to underlying asymptomatic bacteriuria, although this may have been an incidental finding. This does not exclude the possibility that zolpidem was the primary cause of, or exacerbated, her altered mental status. Though zolpidem can be therapeutic and safe, clinicians must be aware of its potential side effects when prescribing it.
Prior studies evaluating the impact of discontinuation of contact precautions (DcCP) on methicillin-resistant Staphylococcus aureus (MRSA) outcomes have characterized all healthcare-associated infections (HAIs) rather than those likely preventable by contact precautions. We aimed to analyze the impact of DcCP on the rate of MRSA HAIs, including transmission events identified through whole genome sequencing (WGS) surveillance.
Design:
Quasi-experimental interrupted time series.
Setting:
Acute care medical center.
Participants:
Inpatients.
Methods:
The effect of DcCP (discontinuing gown and glove use) for encounters with patients with MRSA carriage was evaluated using time series analysis of MRSA HAI rates from January 2019 through December 2022, and by comparing WGS-defined attributable transmission events before and after DcCP in December 2020.
Results:
The MRSA HAI rate was 4.22/10,000 patient days before and 2.98/10,000 patient days after DcCP (incidence rate ratio [IRR] 0.71 [95% confidence interval 0.56–0.89]), with a significant immediate decrease (P = .001). There were 7 WGS-defined attributable transmission events before and 11 events after DcCP (IRR 0.90 [95% confidence interval 0.30–2.55]).
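The incidence rate ratio reported above can be reproduced directly from the two rates, since both are expressed per 10,000 patient days. A minimal sketch (point estimate only; the confidence interval requires the underlying counts and person-time, which the abstract does not report):

```python
# Sketch: incidence rate ratio (IRR) from two rates in the same units.
def incidence_rate_ratio(rate_after: float, rate_before: float) -> float:
    """Ratio of the post-intervention rate to the pre-intervention rate."""
    return rate_after / rate_before

# MRSA HAI rates per 10,000 patient days, before and after DcCP.
irr = incidence_rate_ratio(2.98, 4.22)
print(f"IRR = {irr:.2f}")  # prints IRR = 0.71, matching the reported value
```

An IRR below 1 indicates a lower rate after the policy change; whether the drop is attributable to DcCP rather than secular trend is what the interrupted time series analysis assesses.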
Conclusions:
DcCP did not result in an increase in MRSA HAIs or in WGS-defined attributable transmission events. Comprehensive analyses of the effect of transmission prevention measures should include outcomes specifically measuring transmission-associated HAIs.
The modern marine megafauna is known to play important ecological roles and includes many charismatic species that have drawn the attention of both the scientific community and the public. However, the extinct marine megafauna has never been assessed as a whole, nor has it been defined in deep time. Here, we review the literature to define and list the species that constitute the extinct marine megafauna, and to explore biological and ecological patterns throughout the Phanerozoic. We propose a size cut-off of 1 m of length to define the extinct marine megafauna. Based on this definition, we list 706 taxa belonging to eight main groups. We found that the extinct marine megafauna was conspicuous over the Phanerozoic and ubiquitous across all geological eras and periods, with the Mesozoic, especially the Cretaceous, having the greatest number of taxa. Marine reptiles include the largest size recorded (21 m; Shonisaurus sikanniensis) and contain the highest number of extinct marine megafaunal taxa. This contrasts with today’s assemblage, where marine animals achieve sizes of >30 m. The extinct marine megafaunal taxa were found to be well-represented in the Paleobiology Database, but not better sampled than their smaller counterparts. Among the extinct marine megafauna, there appears to be an overall increase in body size through time. Most extinct megafaunal taxa were inferred to be macropredators preferentially living in coastal environments. Across the Phanerozoic, megafaunal species had similar extinction risks as smaller species, in stark contrast to modern oceans where the large species are most affected by human perturbations. Our work represents a first step towards a better understanding of the marine megafauna that lived in the geological past. However, more work is required to expand our list of taxa and their traits so that we can obtain a more complete picture of their ecology and evolution.
Seismic imaging in 3-D holds great potential for improving our understanding of ice sheet structure and dynamics. Conducting 3-D imaging in remote areas is simplified by using lightweight and logistically straightforward sources. We report results from controlled seismic source tests carried out near the West Antarctic Ice Sheet Divide investigating the characteristics of two types of surface seismic sources, Poulter shots and detonating cord, for use in both 2-D and 3-D seismic surveys on glaciers. Both source types produced strong basal P-wave and S-wave reflections and multiples recorded in three components. The Poulter shots had a higher amplitude for low frequencies (<10 Hz) and comparable amplitude at high frequencies (>50 Hz) relative to the detonating cord. Amplitudes, frequencies, speed of source set-up, and cost all suggested Poulter shots to be the preferred surface source compared to detonating cord for future 2-D and 3-D seismic surveys on glaciers.
In decision-making regarding the management of vestibular schwannomas, an understanding of patient-reported health-related quality-of-life measures, alongside clinical outcomes, is key. Therefore, the aim of this research was to compare health-related quality of life in vestibular schwannoma patients treated with active observation, stereotactic radiotherapy and microsurgical excision.
Methods
A cross-sectional study of patients diagnosed with unilateral sporadic vestibular schwannomas between 1995 and 2015 at a specialist tertiary centre was conducted. Patients completed the Penn Acoustic Neuroma Quality of Life questionnaire and handicap inventories for dizziness, hearing and tinnitus.
Results
Of 234 patients, 136 responded (58.1 per cent). Management modalities were observation (n = 86), stereotactic radiotherapy (n = 23) and microsurgery (n = 25). Females reported significantly worse dizziness; males reported significantly worse physical disability. Patients less than 65 years old reported significantly worse tinnitus and pain scores. Overall, quality of life was higher in the observation group.
Conclusion
Conservative management, where appropriate, is favourable, with higher quality-of-life outcomes in this cohort. This must be weighed against the risks of a growing tumour.
The objective of this study was to determine factors associated with testing positive for SARS-CoV-2 among healthcare personnel. Secondary objectives were to assess representativeness of recruited participants and the effectiveness of a multiple-contact protocol for recruiting healthcare personnel in this COVID-19 study.
Design:
Survey study, conducted as part of an observational test-negative study of COVID-19 vaccine effectiveness.
Setting:
University of Utah Health system, including both inpatient and outpatient facilities.
Participants:
Clinical and non-clinical healthcare personnel at University of Utah Health; 1,456 were contacted and 503 (34.5%) completed the survey. Cases were all eligible employees testing positive for COVID-19, with matched test-negative controls randomly selected weekly at a 3:1 ratio.
Methods:
Online survey.
Results:
Significant differences between the demographics of participants and the source population were observed; e.g., nursing staff comprised 31.6% of participants but only 23.3% of the source population. The multiple-contact recruitment protocol increased participation by ten percentage points and ensured equal representation of controls. Potential exposure to illness outside of work was strongly predictive of testing positive for SARS-CoV-2 (OR = 3.74; 95% CI: 2.29, 6.11), whereas potential exposure at work was protective against testing positive (OR = 0.51; 95% CI: 0.29, 0.88).
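For readers unfamiliar with how odds ratios such as those above are computed, a minimal sketch of an odds ratio with a Wald 95% confidence interval from a 2×2 table. The counts below are hypothetical illustrations, not data from this study:

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio and Wald CI from a 2x2 table:
       a = exposed cases,   b = exposed controls,
       c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only:
or_, lo, hi = odds_ratio_ci(20, 10, 15, 30)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}, {hi:.2f})")  # OR = 4.00 (95% CI 1.50, 10.66)
```

In a test-negative design, "cases" are test-positive and "controls" are test-negative participants, which is why systematic differences in who gets tested (testing bias) can distort these estimates, as the conclusion notes.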
Conclusions:
Carefully designed recruitment protocols increase participation and representation of controls, but bias in participant demographics still exists. The negative association between potential workplace exposure and a positive test suggests testing bias in the test-negative design. Healthcare personnel's potential exposures to COVID-19 outside the workplace are important predictors of testing positive for SARS-CoV-2.
Diagenetic illite growth in porous sandstones leads to significant modifications of the initial pore system which result in tight reservoirs. Understanding and quantifying these changes provides insight into the porosity-permeability history of the reservoir and improves predictions on petrophysical behavior. To characterize the various stages of diagenetic alteration, a focused ion beam – scanning electron microscopy (FIB-SEM) study was undertaken on aeolian sandstones from the Bebertal outcrop of the Parchim Formation (Early Permian Upper Rotliegend group). Based on 3D microscopic reconstructions, three different textural types of illite crystals occur, common to many tight Rotliegend sandstones, namely (1) feldspar grain alterations and associated illite meshworks, (2) tangential grain coats, and (3) pore-filling laths and fibers. Reaction textures, pore structure quantifications, and numerical simulations of fluid transport have revealed that different generations of nano-porosity are connected to the diagenetic alteration of feldspars and the authigenic growth of pore-filling illites. The latter leads to the formation of microstructures that range from authigenic compact tangential grain coatings to highly porous, pore-filling structures. K-feldspar replacement and initial grain coatings of illite are composed primarily of disordered 1Md illite whereas the epitaxially grown illite lath- and fiber-shaped crystals occurring as pore-filling structures are of the trans-vacant 1Mtv polytype. Although all analyzed 3D structures offer connected pathways, the largest reduction in sandstone permeability occurred during the initial formation of the tangential illite coatings that sealed altered feldspars and the subsequent growth of pore-filling laths and fibrous illites. 
Analyses of both illite pore-size and crystallite-size distributions indicate that crystal growth occurred by a continuous nucleation and growth mechanism probably controlled by the multiple influx of potassium-rich fluids during late Triassic and Jurassic times. The detailed insight into the textural varieties of illite crystal growth and its calculated permeabilities provides important constraints for understanding the complexities of fluid-flow in tight reservoir sandstones.
In the United States, Black individuals have suffered from 300 years of racism, bias, and segregation, and have been systematically and intentionally denied opportunities to accrue wealth. These disadvantages have resulted in disparities in health outcomes. Over the last decade there has been growing interest in examining social determinants of health (SDH) as upstream factors that lead to downstream health disparities. It is of vital importance to quantify the contribution of SDH factors to racial disparities in order to inform policy and social justice initiatives. This demonstration project uses years of education and white matter hyperintensities (WMH) to illustrate two methods of quantifying the role of an SDH in producing health disparities.
Participants and Methods:
The current study is a secondary data analysis of baseline data from a subset of the National Alzheimer's Coordinating Center database with neuroimaging data collected from 2002-2019. Participants were 997 cognitively diverse Black and White (10.4% Black) individuals, aged 60-94 (mean = 73.86, 56.5% female), with a mean education of 15.18 years (range = 0-23, SD = 3.55). First, mediation analysis was conducted in the SEM framework using the R package lavaan. Black/White race was the independent variable, education was the mediator, WMH volume was the dependent variable, and age/sex were the covariates. Bootstrapped standard errors were calculated using 1,000 iterations. The indirect effect was then divided by the total effect to determine the proportion of the total effect attributable to education. Second, a population attributable fraction (PAF), the expected reduction in high WMH burden if low education and the structural racism for which Black race serves as a proxy were eliminated, was calculated. Two logistic regressions were run with dichotomous (median-split) WMH as the dependent variable: the first with low (less than high school) versus high education as a predictor, and the second with Black/White race added. Age/sex were covariates. The PAF of education, and then of Black/White race controlling for education, were obtained. Subsequently, a combined PAF was calculated.
Results:
In the lavaan model, the total effect of Black/White race on WMH was not significant (B = .040, se = .113, p = .246); however, Black/White race significantly predicted education (B = -.108, se = .390, p = .001) and education significantly predicted WMH burden (B = -.084, se = .008, p = .002). This resulted in a significant indirect effect (effect = .009, se = .014, p = .032); 22.6% of the relationship between Black/White race and WMH was mediated by education. In the logistic models, the PAF of education was 5.3% and the additional PAF of Black/White race was 2.7%. The combined PAF of Black race and low education was 7.8%.
Conclusions:
From our mediation analysis we can conclude that 22.6% of the relationship between Black/White race and WMH volume is explained by education. Our PAF analysis suggests that we could reduce 7.8% of the cases with high WMH burden if we eliminated low education and the structural racism for which Black race serves as a proxy. This is an underestimation of the role that education and structural racism play in WMH burden, due to our positively selected sample and crude measure of education. However, these methods can help researchers quantify the contribution of SDH to disparities in older adulthood and provide targets for policy change.
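The two headline quantities follow from simple formulas: the proportion mediated is the indirect effect divided by the total effect, and individual PAFs combine multiplicatively. A minimal sketch using the rounded coefficients from the abstract (small discrepancies against the reported 22.6% and 7.8% reflect rounding of the published estimates, and the multiplicative combination is an assumption about the method used):

```python
# Proportion mediated = indirect effect / total effect.
indirect, total = 0.009, 0.040   # rounded path estimates from the mediation model
prop_mediated = indirect / total
print(f"proportion mediated ~ {prop_mediated:.1%}")  # close to the reported 22.6%

# Combined PAF, assuming the standard multiplicative combination
# of two attributable fractions: 1 - (1 - PAF1)(1 - PAF2).
paf_edu, paf_race = 0.053, 0.027
paf_combined = 1 - (1 - paf_edu) * (1 - paf_race)
print(f"combined PAF ~ {paf_combined:.2%}")  # close to the reported 7.8%
```

Note that the combined PAF is slightly less than the sum of the individual PAFs, because the two exposures overlap in the cases they explain.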
This article provides an analysis of the potential danger to a president’s policy agenda that comes from appointing a sitting elected official to the cabinet. We present historical data on cabinet secretaries since the founding and demonstrate that concerns about seats falling to the other party following the appointment of an elected official to the cabinet date back at least to Martin Van Buren’s establishment of the first American mass political party in 1828. We then focus on the post-Seventeenth Amendment cabinet and show that almost 30 percent of cabinet secretaries in this era who were elected officials at the time of their appointment left seats that flipped to the other party by the next regular general election. We conclude by discussing how our results compare with Alexander Hamilton, Martin Van Buren, and Woodrow Wilson’s differing views on the cabinet and the implications for the president’s policy agenda.
Resident physicians are exposed to a more rigorous schedule than the general public. Burnout, as described by the World Health Organization, is a phenomenon occurring in an occupational setting. It consists of three domains: feelings of exhaustion, reduced professional efficacy, and increased mental distance from one's own job. Research shows that increased working hours are associated with higher levels of burnout in resident physicians.
Objectives
Through literature review, we will explore whether this burnout contributes to an increased risk of suicide in the resident physician population.
Methods
Various studies assessing resident training globally were analyzed and compared. A study in Japan distributed a survey to 4,306 resident physicians; suicidal ideation was noted in 5.6% of these physicians, but among those working more than 100 hours in the hospital the rate increased to 7.8%. In Australia, it was found that once doctors in training worked more than 55 hours per week, there was a 50% increase in suicidal ideation. It was also found that 12.3% of respondents in the Australian study had reported suicidal ideation within the 12 months preceding the survey. A study observing 5,126 Dutch residents found that 12% of residents had suicidal ideation, with the rate doubled in the group with burnout compared to the group without.
Results
The studies listed show that increased work hours and burnout were associated with increased suicidal ideation in medical residents. A study observing 1,354 physicians in the US found that higher measures of burnout were associated with suicidal ideation, similar to previous studies. However, once adjusted for depression, an association remained between depression and suicidal ideation but not between burnout and suicidal ideation. Depression may therefore be a confounding variable that was not adjusted for in studies reporting an association of burnout with suicidal ideation. In addition, further research examining the leading causes of death among a total of 381,614 US medical residents between 2000 and 2014 found suicide to be the second most common cause of death. However, among resident physicians aged 25-34.9 years there were 4.07 suicides per 100,000 person-years, compared with 13.07 suicides per 100,000 person-years in the general public.
Conclusions
The rate of suicide was found to be lower in resident physicians than in the general public. Suicidal ideation may be more closely associated with depression than with burnout itself, and depression should be accounted for when assessing suicidal ideation in the resident physician population. The lower suicide rate in resident physicians compared to the general public raises the possibility that burnout in resident physicians is not directly correlated with an increased risk of suicide.
This volume emerged from the notion that marketers and lawyers often talk about the same things. They may use different names, but essentially the things they talk about are the same. For example, while marketers talk about brands, lawyers talk about trademarks. However, relatively late in the process of editing this volume, we, as editors, had a somewhat unsettling realization. Throughout the planning and editing process for this book, we had been laboring under not unrelated, but certainly not identical, views about the domain of marketing and the reach of law. We had no real common understanding of what marketing is, what marketing theory entails, and how the law shapes and governs marketing activities. Such a state of affairs is part of the inevitable risk of bringing together a group of scholars from two distinct disciplines. Fortunately, the realization helped us recognize that both marketing and law are sometimes vessels into which users can pour whatever content they wish. At the start, therefore, we thought it wise to dispense with some misconceptions and offer at least some working definitions of the terms and ideas we encounter in this volume.