Malnutrition from poor diet is a persistent issue in Sri Lanka, especially among women and children. High rates of undernutrition and micronutrient deficiencies are documented among rural poor communities(1). Household food production may enhance maternal and child nutrition directly by increasing access to diverse foods and indirectly by providing income to diversify diets(2). This study explores the cross-sectional relationship between household food production and individual dietary diversity among women aged 18–45 years and children aged 2–5 years in Batticaloa district, Sri Lanka. We randomly selected 450 low-income mother-child pairs receiving a Samurdhi subsidy and maintaining a home garden. Through face-to-face interviews, we gathered information on the types of crops grown and livestock reared in the preceding 12 months. Production quantity and utilization were also detailed. Additionally, socio-demographic information and market access were obtained. To measure women’s dietary diversity (DD), we used a 10-food-group scale; a 7-food-group scale was used for children. Women who consumed five or more food groups were defined as meeting the Minimum Dietary Diversity for Women (MDD-W), whereas children who consumed four or more food groups met the minimum standard. Multiple linear regression and binary logistic regression were used to identify the factors predicting individual DD. Complete data for 411 pairs were analysed. The results showed that only 15.3% of the women met MDD-W, with a mean DDS of 3.3 (SD = 1.2). Children had a mean DDS of 3.3 (SD = 1.2), and 41.1% of them met the minimum diversity. Regression analysis indicated that growing leafy vegetables was positively associated with increased dietary diversity of women (β = 0.337; 95% CI: 0.13, 0.54; p = 0.001) and children (β = 0.234; 95% CI: 0.05, 0.42; p = 0.013) but not with meeting the minimum diversity. Moreover, monthly income above 35,000 LKR, higher education level, a secondary income source and food security were also positively associated with women’s DD. Conversely, living further away from the main road reduced the women’s DD. Interestingly, livestock ownership was associated with women meeting the MDD-W but not with children meeting the minimum diversity. For children, monthly income was a strong predictor of DD and of meeting minimum diversity. Surprisingly, living far from the market was associated with increased DD in children (β = 0.018; 95% CI: 0.01, 0.03; p = 0.013), while distance to the main road had a similar effect as in women. Notably, selling produce at the market contributed to meeting the minimum dietary diversity in children (β = 0.573; 95% CI: 0.14, 1.02; p = 0.013). These findings suggest that enhancing household food production could play a crucial role in improving dietary diversity and addressing malnutrition, particularly in rural Sri Lankan communities, and potentially in other similar settings.
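As a rough illustration of how the women’s score is computed, the sketch below tallies the 10 MDD-W food groups. The group names and the ≥5 cut-off follow the standard FAO indicator; the function and variable names are invented for this example and are not the study’s instruments.

```python
# The ten MDD-W food groups (standard FAO indicator); a woman meets MDD-W
# if she consumed five or more of them in the previous day or night.
MDD_W_GROUPS = [
    "grains_roots_tubers", "pulses", "nuts_seeds", "dairy",
    "meat_poultry_fish", "eggs", "dark_green_leafy_vegetables",
    "other_vitamin_a_rich_fruits_vegetables", "other_vegetables", "other_fruits",
]

def mdd_w_score(consumed):
    """Return (dietary diversity score, whether MDD-W is met)."""
    score = sum(group in consumed for group in MDD_W_GROUPS)
    return score, score >= 5

# Example: a respondent who ate rice, lentils, leafy greens and eggs.
print(mdd_w_score({"grains_roots_tubers", "pulses",
                   "dark_green_leafy_vegetables", "eggs"}))  # (4, False)
```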
Micronutrient malnutrition is a public health concern in many developing countries, including Sri Lanka. Rural poor households are more vulnerable to micronutrient malnutrition due to their monotonous rice-based diet, which lacks dietary diversification(1). Despite the potential of home gardens to increase food access and diversity, their contribution to household dietary diversity remains unclear. This study aimed to investigate the impact of home gardens on diet diversity among rural Sri Lankan households. Low-income households with children under five were randomly selected from the Samurdhi beneficiary list, and 450 households with a home garden agreed to be interviewed. We collected information on the types of crops and livestock produced over the past 12 months and their utilisation. We also collected the socio-demographic characteristics of the households. We measured household dietary diversity using the Household Dietary Diversity Score (HDDS) based on FAO guidelines. Multiple linear regression was used to identify the predictors of HDDS. Complete data sets were available for only 411 households, which were included in the analysis. The HDDS ranged from 3 to 10 with a mean of 6.4 (±1.37 SD), indicating a moderate level of dietary diversity. However, only 20.4% of the households met the adequacy threshold (an HDDS above the third quartile)(2). Cereals, and fats and oils, were the only food groups consumed by all the households. Although many households produced fruits (67.2%) and reared livestock (48.2%), consumption of these groups was the lowest among the 12 food groups. Predictors of HDDS included monthly household income, which had a strong positive relationship, especially earnings above 35,000 LKR (β = 1.02; S.E = 0.246; p < 0.001). Surprisingly, living far from the market was associated with increased HDDS (β = 0.026; S.E = 0.008; p = 0.004). Conversely, living further away from the main road reduced the HDDS (β = −0.133; S.E = 0.049; p = 0.007). Growing staples reduced the HDDS (β = −0.395; S.E = 0.174; p = 0.023), whereas growing leafy vegetables increased diet diversity (β = 0.394; S.E = 0.154; p = 0.010). Selling homegrown products also increased HDDS (β = 0.276; S.E = 0.136; p = 0.043). However, other covariates, such as the education level of the female adult, household food security status, home garden yield (kg), and livestock richness, which showed significant correlations in the bivariate analysis, were not significant in the multiple regression analysis. Although all households in this district engage in some form of home gardening, 79.6% did not have adequate dietary diversity. There is a need to understand how home gardens can better contribute to dietary diversity.
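For readers unfamiliar with the analysis, here is a minimal sketch of the kind of multiple linear regression described above. The data frame and column names are fabricated placeholders, not the study’s actual variables or values.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Fabricated household-level records, loosely mirroring the abstract's
# covariates; none of these values come from the study.
df = pd.DataFrame({
    "hdds":             [6, 7, 5, 8, 6, 4, 7, 9, 5, 6],
    "income_above_35k": [0, 1, 0, 1, 0, 0, 1, 1, 0, 0],
    "km_to_market":     [2.0, 5.5, 1.0, 8.0, 3.0, 0.5, 6.0, 7.5, 2.5, 4.0],
    "km_to_main_road":  [0.5, 0.2, 1.5, 0.1, 2.0, 3.0, 0.3, 0.2, 1.0, 0.8],
    "grows_leafy_veg":  [1, 1, 0, 1, 0, 0, 1, 1, 0, 1],
})

# Ordinary least squares with HDDS as the outcome, as in the abstract.
model = smf.ols(
    "hdds ~ income_above_35k + km_to_market + km_to_main_road + grows_leafy_veg",
    data=df,
).fit()
print(model.summary())
```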
Background:
Duchenne muscular dystrophy is a devastating neuromuscular disorder characterized by the loss of dystrophin, inevitably leading to cardiomyopathy. Despite published guidance on prophylaxis and treatment with cardiac medications to mitigate cardiomyopathy progression, gaps remain regarding the specifics of medication initiation and optimization.
Method:
This document is an expert opinion statement, addressing a critical gap in cardiac care for Duchenne muscular dystrophy. It provides thorough recommendations for the initiation and titration of cardiac medications based on disease progression and patient response. Recommendations are derived from the expertise of the Advance Cardiac Therapies Improving Outcomes Network and are informed by established guidelines from the American Heart Association, American College of Cardiology, and Duchenne Muscular Dystrophy Care Considerations. These expert-derived recommendations aim to navigate the complexities of Duchenne muscular dystrophy-related cardiac care.
Results:
Comprehensive recommendations for initiation, titration, and optimization of critical cardiac medications are provided to address Duchenne muscular dystrophy-associated cardiomyopathy.
Discussion:
The management of Duchenne muscular dystrophy requires a multidisciplinary approach. However, the diversity of healthcare providers involved in Duchenne muscular dystrophy care can result in variations in cardiac management, complicating treatment standardization and potentially affecting patient outcomes. The aim of this report is to provide a roadmap for managing Duchenne muscular dystrophy-associated cardiomyopathy by elucidating the timing and dosage nuances crucial for optimal therapeutic efficacy, ultimately improving cardiac outcomes and the quality of life of individuals with Duchenne muscular dystrophy.
Conclusion:
This document seeks to establish a standardized framework for cardiac care in Duchenne muscular dystrophy, aiming to improve cardiac prognosis.
We examine a variant of ultimatum bargaining in which principals may delegate their proposal decision to agents hired from a competitive market. Contrary to several prior studies, we find that when principals must use agents, the resulting proposals are significantly higher than when principals make proposals themselves. In reconciling our results with prior findings, we conclude that both the rejection power afforded to responders and the structure of principal-agent contracts can play significant roles in the nature of outcomes under delegated bargaining.
Patients with severe mental illnesses (SMI) are often exposed to polymedication. Additionally, the risk of somatic diseases is twice as high in patients with SMI as in individuals without a psychiatric disorder. Furthermore, drug–drug interactions (DDI) between psychiatric drugs and somatic medications are a well-known cause of adverse drug reactions (ADR).
Objectives
The aim of this study was to analyse whether already known DDI related to psychiatric drugs and somatic medication still occur in everyday clinical practice.
Methods
We identified all spontaneous ADR reports from Germany contained in the European ADR database EudraVigilance, received between January 2017 and December 2021 for patients older than 17 years, in which antidepressants, antipsychotics or mood stabilizers were reported as suspected/interacting (n = 9,665). ADR reports referring to intentional overdoses and suicide attempts were excluded, leaving n = 9,276 reports. We used the ABDATA drug information system to identify all potential DDI (pDDI). The identified reports with pDDI were then assessed individually to determine whether the respective DDI had occurred.
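To make the screening step concrete, here is a minimal sketch of flagging potential DDI within one report’s drug list. The interaction table is a toy stand-in, since the ABDATA drug information system used in the study is proprietary; all pairs and mechanisms below are illustrative only.

```python
# Toy interaction table; pairs and mechanisms are illustrative stand-ins
# for the proprietary ABDATA drug information system used in the study.
PDDI_TABLE = {
    frozenset({"sertraline", "phenprocoumon"}): "increased bleeding risk",
    frozenset({"citalopram", "hydrochlorothiazide"}): "hyponatraemia",
    frozenset({"fluoxetine", "metoprolol"}): "increased beta-blocker effect",
}

def screen_report(drugs):
    """Return all potentially interacting drug pairs found in one ADR report."""
    hits = []
    for i, drug_a in enumerate(drugs):
        for drug_b in drugs[i + 1:]:
            mechanism = PDDI_TABLE.get(frozenset({drug_a, drug_b}))
            if mechanism:
                hits.append((drug_a, drug_b, mechanism))
    return hits

print(screen_report(["sertraline", "phenprocoumon", "pantoprazole"]))
# [('sertraline', 'phenprocoumon', 'increased bleeding risk')]
```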
Results
1,271 reports with 728 potentially interacting drug pairs related to psychiatric drugs and somatic medications, comprising 2,655 pDDI, were found. Among potentially interacting drug pairs with more than 10 reports, the most frequently identified pDDI were (i) hyponatremia related to antidepressants and diuretics (n = 362, 32.6%), (ii) bleeding events related to selective serotonin reuptake inhibitors (SSRI) and platelet aggregation inhibitors, anticoagulants or non-steroidal anti-inflammatory drugs (NSAID) (n = 295, 17.5%), and (iii) increased beta-blocker effects related to SSRIs and beta-blockers (n = 126, 11.3%). After individual case assessment, bleeding events were reported in 33.3% (14/42) of the reports related to SSRIs and anticoagulants, in 23.7% (45/190) of those related to SSRIs and platelet aggregation inhibitors, and in 17.4% (8/46) of those related to SSRIs and NSAIDs. Hyponatremia was reported in 7.6% (22/289) of the reports related to antidepressants and diuretics, and increased beta-blocker effects in 6.9% (8/116) of the reports related to SSRIs and beta-blockers.
Conclusions
According to our analysis, well-known DDI still occur in the treatment of psychiatric patients with psychiatric drugs and somatic medication. Whenever possible, alternative drug combinations with a lower potential for DDI should be considered, or appropriate monitoring measures should be conducted.
Background: Frailty and sarcopenia predict worse surgical outcomes among spinal degenerative and deformity-related populations; this association is less clear in the context of spinal oncology. Here, we identified frailty and sarcopenia tools applied in spinal oncology and appraised their clinimetric properties. Methods: A systematic review was conducted of studies published from January 1, 2000, to June 2022. Study characteristics, frailty tools, measures of sarcopenia, component domains, individual items, cut-off values, and measurement techniques were collected. Clinimetric assessment was performed according to Consensus-based Standards for Health Measurement Instruments. Results: Twenty-two studies were included (42,514 patients). The three most employed frailty tools were the Metastatic Spine tumor Frailty Index (MSTFI), the Modified Frailty Index-11 (mFI-11), and the mFI-5. The three most common sarcopenia measures were the L3-Total Psoas Area (TPA)/Vertebral Body Area (VBA), L3-TPA/Height², and L3-Spinal Muscle Index (L3-Cross-Sectional Muscle Area/Height²). Frailty and sarcopenia measures lacked content and construct validity. Positive predictive validity was observed in select studies employing the Hospital Frailty Risk Score (HFRS), mFI-5, MSTFI, and L3-TPA/VBA. All frailty tools had floor or ceiling effects. Conclusions: Existing tools for evaluating frailty and sarcopenia in surgical spine oncology have poor clinimetric properties. Here, we provide a pragmatic approach to utilizing existing frailty and sarcopenia tools until more clinimetrically robust instruments are developed.
Eustachian tube dysfunction is prevalent in both paediatric and adult populations. Current clinical guidelines recommend observation over topical intranasal corticosteroids for Eustachian tube dysfunction management, a recommendation that remains controversial. This study aimed to systematically review randomised, controlled trials assessing the efficacy of topical intranasal corticosteroids in Eustachian tube dysfunction, and to analyse their effect through tympanometric normalisation.
Methods
PubMed, EMBASE, Web of Science and Cochrane Library databases were searched. All randomised, controlled trials assessing intranasal corticosteroids in adult or paediatric Eustachian tube dysfunction patients were included. A meta-analysis of proportions was used to evaluate tympanogram normalisation.
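As an orientation to the pooling step, the sketch below combines trial-level log odds ratios under a random-effects model. The counts are invented for illustration; the review’s actual trial data are not reproduced here.

```python
import numpy as np
from statsmodels.stats.meta_analysis import combine_effects

# Invented per-trial counts (normalised/total ears, treatment vs control).
treat_norm, treat_n = np.array([30, 45, 22, 50]), np.array([60, 80, 50, 90])
ctrl_norm, ctrl_n = np.array([28, 40, 25, 45]), np.array([60, 80, 50, 90])

# Log odds ratio and its variance for each trial.
a, b = treat_norm, treat_n - treat_norm
c, d = ctrl_norm, ctrl_n - ctrl_norm
log_or = np.log((a * d) / (b * c))
var_log_or = 1 / a + 1 / b + 1 / c + 1 / d

# Random-effects pooling of the trial-level log odds ratios.
result = combine_effects(log_or, var_log_or)
print(result.summary_frame())  # exponentiate 'eff' to recover odds ratios
```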
Results
Of 330 results, eight randomised, controlled trials met inclusion criteria and underwent qualitative data synthesis and risk-of-bias analysis. Meta-analysis of tympanometry data from four eligible trials (n = 512 ears) revealed no significant difference in tympanometric normalisation between intranasal corticosteroids and control (odds ratio 1.21, 95% confidence interval 0.65–2.24).
Conclusion
Study results do not strongly support intranasal corticosteroids for Eustachian tube dysfunction. Data were limited, emphasising the need for larger, higher quality, randomised, controlled trials.
We performed a literature review to describe the risk of surgical-site infection (SSI) in minimally invasive surgery (MIS) compared to standard open surgery. Most studies reported decreased SSI rates among patients undergoing MIS compared to open procedures, although many were observational studies that may have been affected by selection bias. Nevertheless, MIS is associated with a reduced risk of surgical-site infection compared to standard open surgery and should be considered when feasible.
We evaluated diagnostic test and antibiotic utilization among 252 patients from 11 US hospitals who were assessed for coronavirus disease 2019 (COVID-19) pneumonia during the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) omicron variant pandemic wave. In our cohort, antibiotic use remained high (62%) among SARS-CoV-2–positive patients and was even higher among those who underwent procalcitonin testing (68%).
Spatially resolved transcriptomics (SRT) is a growing field that links gene expression to anatomical context. SRT approaches that use next-generation sequencing (NGS) combine RNA sequencing with histological or fluorescent imaging to generate spatial maps of gene expression in intact tissue sections. These technologies directly couple gene expression measurements with high-resolution histological or immunofluorescent images that contain rich morphological information about the tissue under study. While broad access to NGS-based spatial transcriptomic technology is now commercially available through the Visium platform from the vendor 10x Genomics, computational tools for extracting image-derived metrics for integration with gene expression data remain limited. We developed VistoSeg, a MATLAB pipeline to process, analyze and interactively visualize the high-resolution images generated by the Visium platform. VistoSeg outputs can be easily integrated with the accompanying transcriptomic data to facilitate downstream analyses in common programming languages including R and Python. VistoSeg provides user-friendly tools for integrating image-derived metrics from histological and immunofluorescent images with spatially resolved gene expression data. Integrating these data enhances the ability to understand the transcriptional landscape within tissue architecture. VistoSeg is freely available at http://research.libd.org/VistoSeg/.
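As a minimal sketch of the kind of downstream integration described above, image-derived per-spot metrics can be joined to expression data on the Visium spot barcode. The column names and values here are invented for illustration, not VistoSeg’s actual output schema; consult http://research.libd.org/VistoSeg/ for the real interface.

```python
import pandas as pd

# Toy stand-ins for VistoSeg image-derived metrics and Visium expression
# counts; barcodes and columns are invented for illustration.
image_metrics = pd.DataFrame({
    "barcode": ["AAAC-1", "AAAG-1", "AACT-1"],
    "cell_count": [4, 7, 2],  # e.g. segmented nuclei per spot
})
expression = pd.DataFrame({
    "barcode": ["AAAC-1", "AAAG-1", "AACT-1"],
    "GeneA": [12, 30, 5],
    "GeneB": [0, 8, 2],
})

# Join on the shared spot barcode so downstream models can condition
# gene expression on tissue morphology.
merged = expression.merge(image_metrics, on="barcode", how="inner")
print(merged)
```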
Background: Frailty is increasingly recognized for its association with adverse events, mortality, and hospital discharge disposition among surgical patients. The purpose of this study was to describe how spinal surgeons conceptualize, define, and assess frailty in the context of spinal metastatic disease (SMD). Methods: We conducted an international, cross-sectional, 33-question survey of the AO Spine community. The survey was developed using a modified Delphi technique and was designed to elucidate preoperative surrogate markers of frailty in the context of SMD. Responses were ranked using weighted averages. Consensus was defined as ≥70% agreement among respondents. Results: Results were analyzed for 312 respondents (86% completion rate). Study participants represented 71 countries. Most respondents informally assess frailty in patients with SMD by forming a general perception based on clinical condition and patient history. Consensus was attained regarding the association between 14 clinical variables and frailty. Severe comorbidities, systemic disease burden, and poor performance status were most associated with frailty; severe comorbidities included high-risk cardiopulmonary disease, renal failure, liver failure, and malnutrition. Conclusions: Surgeons recognize that frailty is important but commonly evaluate it based on general clinical impression rather than using existing frailty tools. We identified the preoperative surrogate markers of frailty perceived as most relevant in this population.
Diagnosing the evolution of laser-generated high energy density (HED) systems is fundamental to developing a correct understanding of the behavior of matter under extreme conditions. Talbot–Lau interferometry constitutes a promising tool, since it permits simultaneous single-shot X-ray radiography and phase-contrast imaging of dense plasmas. We present the results of an experiment at OMEGA EP that aimed to probe the ablation front of a laser-irradiated foil using a Talbot–Lau X-ray interferometer. A polystyrene (CH) foil was irradiated by a 133 J, 1 ns laser and probed with 8 keV laser-produced backlighter radiation from Cu foils driven by a short-pulse laser (153 J, 11 ps). The ablation-front interferograms were processed in combination with a set of reference images obtained ex situ using phase-stepping. We obtained attenuation and phase-shift images of a laser-irradiated foil for electron densities above $10^{22}\;\mathrm{cm}^{-3}$. These results showcase the capabilities of Talbot–Lau X-ray diagnostic methods for diagnosing HED laser-generated plasmas through high-resolution imaging.
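For orientation, a standard X-ray phase-contrast relation (not stated in the abstract, included here as background) connects the measured phase shift to the line-integrated electron density:

$$\phi = \frac{2\pi}{\lambda}\int \delta\,\mathrm{d}l,\qquad \delta \simeq \frac{r_e\,\lambda^2}{2\pi}\,n_e \;\Longrightarrow\; \phi = r_e\,\lambda \int n_e\,\mathrm{d}l,$$

where $r_e$ is the classical electron radius and $\lambda \approx 1.55$ Å for 8 keV photons; recovering $\phi$ from the interferograms therefore yields the areal electron density of the plasma.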
Objective:
To evaluate the impact of a diagnostic stewardship intervention on Clostridioides difficile healthcare-associated infections (HAI).
Design:
Quality improvement study.
Setting:
Two urban acute care hospitals.
Interventions:
All inpatient stool testing for C. difficile required review and approval prior to specimen processing in the laboratory. An infection preventionist reviewed all orders daily through chart review and conversations with nursing staff; orders meeting clinical criteria for testing were approved, while orders not meeting clinical criteria were discussed with the ordering provider. The proportion of completed tests meeting clinical criteria for testing and the primary outcome of C. difficile HAI were compared before and after the intervention.
Results:
The frequency of completed C. difficile orders not meeting criteria was lower [146 (7.5%) of 1,958] in the intervention period (January 10, 2022–October 14, 2022) than in the sampled 3-month preintervention period [26 (21.0%) of 124; P < .001]. C. difficile HAI rates were 8.80 per 10,000 patient days prior to the intervention (March 1, 2021–January 9, 2022) and 7.69 per 10,000 patient days during the intervention period (incidence rate ratio, 0.87; 95% confidence interval, 0.73–1.05; P = .13).
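For reference, the reported incidence rate ratio is simply the ratio of the two rates:

$$\mathrm{IRR} = \frac{7.69}{8.80} \approx 0.87,$$

matching the value reported above, with the confidence interval spanning 1 (hence the nonsignificant P value).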
Conclusions:
A stringent order-approval process reduced clinically nonindicated testing for C. difficile but did not significantly decrease HAIs.
The use of farrowing crates is increasingly questioned from an animal welfare point of view. Even so, since a number of attempts to develop loose farrowing systems have been unsuccessful, leading to high levels of piglet mortality due mainly to crushing, many farmers remain sceptical as to whether alternative systems can be viable. On the other hand, several European countries have introduced legislation requiring loose farrowing systems, thus promoting research into this type of housing and allowing for performance studies based on large samples of commercial farms. As a consequence of these recent developments, we think it timely to reconsider the evidence available on loose farrowing systems. In our review, we first address the normal peri-parturient behaviour of domestic pigs, as well as studies comparing behaviour and stress physiology in sows kept in crates and in loose systems during farrowing. We then review approaches taken to develop alternative farrowing systems in different countries, and focus lastly on pen, piglet and sow characteristics that contribute to piglet survival in loose farrowing systems. Taking scientific evidence as well as practical experience into account, we conclude that piglet mortality in loose farrowing systems need not exceed that of crate systems. To obtain good performance results, sows due to farrow should be kept individually in sufficiently large pens, preferably structured into a nest area and an activity area. Furthermore, management and breeding practices that promote high piglet viability and good maternal behaviour are essential to achieve high production in loose farrowing systems.
Crating sows in farrowing systems greatly restricts their normal behaviour, which is usually justified by the assumption that piglet mortality is higher with loose-housed sows. Based on experiments showing that this is not the case, farrowing crates were banned in Switzerland in 1997. Since then, many farms have introduced loose farrowing systems, enabling a comparison of piglet mortality in farrowing systems with and without crates based on a large sample size. Data from a sow-recording scheme (UFA2000) were analysed using generalised linear mixed-effects models with an underlying Poisson distribution. In 2002 and 2003, the average total piglet mortality on 173 farms (n = 18,824 litters) with loose farrowing systems amounted to 1.40 piglets per litter and did not differ from that of 482 farms (n = 44,837 litters) with crates (1.42 piglets per litter). Nevertheless, the number of crushed piglets was significantly higher in pens with loose-housed sows (0.62 versus 0.52 piglets per litter), whereas the number of piglets that died for other reasons was significantly higher in crates (0.89 in crates versus 0.78 in loose systems). Total piglet mortality was influenced by litter size at birth, age of the sow and season. Consequently, evaluation of the reproductive data of commercial farms shows that no more piglet losses occur in loose farrowing pens, common nowadays in Switzerland, than in farrowing pens with crates, and that litter size at birth is the main influence on piglet losses.
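A minimal sketch of the kind of Poisson regression described above follows. It uses fabricated litter records and, unlike the study’s mixed-effects models, omits the farm-level random effect for brevity.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Fabricated litter-level records; the study analysed >60,000 litters
# with farm as a grouping factor, which this sketch omits.
litters = pd.DataFrame({
    "deaths":       [1, 0, 2, 1, 3, 0, 1, 2],
    "loose_system": [1, 1, 0, 0, 1, 0, 1, 0],
    "litter_size":  [12, 10, 14, 11, 15, 9, 13, 12],
    "sow_parity":   [2, 1, 4, 3, 5, 1, 2, 3],
})

# Poisson regression of per-litter piglet deaths on housing system,
# litter size at birth, and sow age (parity).
model = smf.glm(
    "deaths ~ loose_system + litter_size + sow_parity",
    data=litters,
    family=sm.families.Poisson(),
).fit()
print(model.summary())
```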
Morally challenging decisions tend to be perceived as difficult by decision makers and often lead to post-decisional worry or regret. To test potential causes of these consequences, we employed realistic, morally challenging scenarios with two conflicting choice options. In addition to respondents’ choices, we collected various ratings of choice options, decision-modes employed, as well as physiological arousal, assessed via skin conductance. Not surprisingly, option ratings predicted choice, such that the more positively rated option was chosen. However, respondents’ self-reported decision modes also independently predicted choice. We further found that simultaneously engaging in decision modes that predict opposing choices increased decision difficulty and post-decision worry. In some cases this was related to increased arousal. Results suggest that at least a portion of the negative consequences associated with morally challenging decisions can be attributed to conflict in the decision modes one engages in.
Testing of asymptomatic patients for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) (ie, “asymptomatic screening”) to attempt to reduce the risk of nosocomial transmission has been extensive and resource intensive, and such testing is of unclear benefit when added to other layers of infection prevention mitigation controls. In addition, the logistic challenges and costs related to screening program implementation, data noting the lack of substantial aerosol generation with elective controlled intubation, extubation, and other procedures, and the adverse patient and facility consequences of asymptomatic screening call into question the utility of this infection prevention intervention. Consequently, the Society for Healthcare Epidemiology of America (SHEA) recommends against routine universal use of asymptomatic screening for SARS-CoV-2 in healthcare facilities. Specifically, preprocedure asymptomatic screening is unlikely to provide incremental benefit in preventing SARS-CoV-2 transmission in the procedural and perioperative environment when other infection prevention strategies are in place, and it should not be considered a requirement for all patients. Admission screening may be beneficial during times of increased virus transmission in some settings where other layers of controls are limited (eg, behavioral health, congregate care, or shared patient rooms), but widespread routine use of admission asymptomatic screening is not recommended over strengthening other infection prevention controls. In this commentary, we outline the challenges surrounding the use of asymptomatic screening, including the logistics and costs of implementing a screening program and the adverse patient and facility consequences. We review data pertaining to the lack of substantial aerosol generation during elective controlled intubation, extubation, and other procedures, and we provide guidance for when asymptomatic screening for SARS-CoV-2 may be considered in a limited scope.
On Monday, 20 October 2014, Renée Zellweger, the actor largely known for playing the ‘wanton sex goddess’ and famously flawed Bridget Jones, shocked the world with an entirely new appearance that many argued made her unrecognisable. Appearing at Elle magazine's annual Women in Hollywood awards, Zellweger's face evoked ‘audible gasps’ from the audience and paparazzi. The New York Daily News quoted an anonymous source: ‘When people got up close to her, they were taken back by what she had done to her face. Everyone was whispering about how different she looked.’ Seemingly, the rest of the press was in full agreement. ‘Renée Zellweger looks unrecognisable,’ noted the Huffington Post UK. While she remained conventionally attractive, with long blonde hair and luminously white, wrinkle-free skin, press accounts agreed that she didn't look like ‘herself’. Indeed, later in the day Huff Post UK published a list of five stars who looked more like Renée Zellweger than Renée Zellweger: Juliette Lewis, Christina Applegate, Cameron Diaz, Sarah Jessica Parker and Rosie Huntington-Whiteley. Using journalist Amanda Hess's turn of phrase, Zellweger had been reduced to an imitation of herself. The consensus among all reports was that Zellweger had not only made use of but had over-indulged in age-defying technologies (specifically plastic surgery), turning herself into a signboard for excess and a cautionary tale about female vanity and celebrity culture. As I will discuss further in this essay, Zellweger resolutely denied that her appearance was the result of plastic surgery, arguing instead that she was simply happier, an emotional state that had written itself ‘naturally’ on her face. ‘Did she or didn't she?’ became the rallying cry around Renée Zellweger, a question intent on getting at the truth represented by her face.
The title of my essay is a turn on Pablo Picasso's famous musing about art:
We all know that Art is not truth. Art is a lie that makes us realise truth, at least the truth that is given us to understand. The artist must know how to convince others of the truthfulness of his lies.
Surgeons may offer different treatments for similar conditions on the basis of their compensation mechanism. This study examined differences in surgical practices between salaried and fee-for-service (FFS) surgeons for two common degenerative spine conditions.
Methods:
The study assessed the practices of 63 spine surgeons across eight Canadian provinces (39 FFS surgeons and 24 salaried) who performed surgery for two lumbar conditions: stable spinal stenosis and degenerative spondylolisthesis. The study included a multicenter, ambispective review of consecutive spine surgery patients enrolled in the Canadian Spine Outcomes and Research Network registry between October 2012 and July 2018. The primary outcome was the difference in the types of procedures performed between the two groups. Secondary study variables included surgical characteristics, baseline patient factors, and patient-reported outcomes.
Results:
For stable spinal stenosis (n = 2,234), salaried surgeons performed significantly fewer uninstrumented fusions (p < 0.05) than FFS surgeons. For degenerative spondylolisthesis (n = 1,292), salaried surgeons performed significantly more instrumentation plus interbody fusions (p < 0.05). There were no statistical differences in patient-reported outcomes between the two groups.
Conclusions:
Surgeon compensation was associated with different approaches to stable lumbar spinal stenosis and degenerative lumbar spondylolisthesis. Salaried surgeons chose a more conservative approach to spinal stenosis and a more aggressive approach to degenerative spondylolisthesis, which highlights that remuneration is likely a minor determinant of the differences in spinal surgery practice in Canada. Further research is needed to elucidate which variables, other than patient demographics and financial incentives, influence surgical decision-making.