Disparities in access to palliative care persist, particularly among underserved populations. We elicited recommendations for integrating community health workers (CHWs) into clinical care teams by exploring stakeholder perspectives on potential barriers and facilitators, with the ultimate aim of promoting equitable access to palliative care.
Materials and Methods:
Twenty-five stakeholders were recruited for semi-structured interviews through purposive snowball sampling at three enrollment sites in the USA. Interviews were conducted to understand perspectives on the implementation of a CHW palliative care intervention for African American patients with advanced cancer. After transcription, primary and secondary coding were conducted. Framework analysis was utilized to refine the data, clarify themes, and generate recommendations for integrating CHWs into palliative care teams.
Results:
Our sample comprised 25 key informants, including 6 palliative care providers, 6 oncologists, 5 cancer center leaders, 2 cancer care navigators, and 6 CHWs. Thematic analysis revealed five domains of recommendations: (1) increasing awareness and understanding of the CHW role, (2) improving communication and collaboration, (3) ensuring access to resources, (4) enhancing CHW training, and (5) ensuring leadership support for integration. Informants shared barriers, facilitators, and recommendations within each domain based on their experiences.
Conclusion:
Barriers to CHW integration within palliative care teams included limited awareness of the CHW role and inadequate training opportunities, alongside practical and logistical challenges. Conversely, promoting CHW engagement, providing adequate training, and ensuring support from leadership have the potential to aid integration.
The global population and status of Snowy Owls Bubo scandiacus are particularly challenging to assess because individuals are irruptive and nomadic, and the breeding range is restricted to the remote circumpolar Arctic tundra. The International Union for Conservation of Nature (IUCN) uplisted the Snowy Owl to “Vulnerable” in 2017 because the suggested population estimates appeared considerably lower than historical estimates, and it recommended actions to clarify the population size, structure, and trends. Here we present a broad review and status assessment, an effort led by the International Snowy Owl Working Group (ISOWG) and researchers from around the world, to estimate population trends and the current global status of the Snowy Owl. We use long-term breeding data, genetic studies, satellite-GPS tracking, and survival estimates to assess current population trends at several monitoring sites in the Arctic, and we review the ecology and threats throughout the Snowy Owl range. An assessment of the available data suggests that current estimates of a worldwide population of 14,000–28,000 breeding adults are plausible. Our assessment of population trends at five long-term monitoring sites suggests that breeding populations of Snowy Owls in the Arctic have decreased by more than 30% over the past three generations, and that the species should continue to be categorised as Vulnerable under IUCN Red List Criterion A2. We offer research recommendations to improve our understanding of Snowy Owl biology and future population assessments in a changing world.
Pre-diagnostic stages of psychotic illnesses, including ‘clinical high risk’ (CHR), are marked by sleep disturbances. These sleep disturbances appear to represent a key aspect in the etiology and maintenance of psychotic disorders. We aimed to examine the relationship between self-reported sleep dysfunction and attenuated psychotic symptoms (APS) on a day-to-day basis.
Methods
Seventy-six CHR young people completed the Experience Sampling Methodology (ESM) component of the European Union Gene-Environment Interaction Study. Data were collected through PsyMate® devices, which prompted sleep and symptom questionnaires 10 times daily for 6 days. Bayesian multilevel mixed linear regression analyses were performed on time-variant ESM data using the brms package in R. We investigated the day-to-day associations between sleep and psychotic experiences bidirectionally at the item level. Sleep items included sleep onset latency, fragmentation, and quality. Psychosis items assessed a range of perceptual, cognitive, and bizarre thought content common in the CHR population.
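To make the modelling step concrete, here is a minimal Python sketch of an analogous day-to-day lagged analysis. The original study fitted Bayesian multilevel models with the brms package in R, so this frequentist statsmodels version, with hypothetical column names (pid, day, awakenings, unreal), is an illustration only.

```python
# Sketch: does last night's number of awakenings predict today's
# "feeling unreal" rating, with a random intercept per participant?
# Frequentist stand-in for the Bayesian brms model used in the study.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("esm_data.csv")          # hypothetical file: one row per ESM prompt
df = df.sort_values(["pid", "day"])
# Lag the sleep item so each day's symptom is paired with the previous night.
df["awakenings_lag"] = df.groupby("pid")["awakenings"].shift(1)

model = smf.mixedlm(
    "unreal ~ awakenings_lag",            # next-day symptom on prior-night sleep
    data=df.dropna(subset=["awakenings_lag"]),
    groups="pid",                         # random intercept per participant
)
print(model.fit().summary())
```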
Results
Two of the seven psychosis variables were unidirectionally predicted by the previous night's number of awakenings: every unit increase in the number of nightly awakenings predicted a 0.27 and 0.28 unit increase in feeling unreal or paranoid the next day, respectively. No other sleep variables credibly predicted next-day psychotic symptoms, or vice versa.
Conclusion
In this study, the relationship between sleep disturbance and APS appears specific to the item in question. However, some APS, including perceptual disturbances, had low levels of endorsement amongst this sample. Nonetheless, these results provide evidence for a unidirectional relationship between sleep and some APS in this population.
Contextual memory, which refers to the ability to remember spatial or temporal circumstances related to an event, is affected early in Alzheimer’s Disease (AD). Computerized cognitive tasks have been suggested to be an ecological way to assess memory, but there are few studies that utilize these tools. Studying contextual memory via a computerized task in a Colombian kindred with autosomal dominant AD due to the Presenilin-1 (PSEN1) E280A mutation and a well-characterized disease progression may help us understand contextual memory changes in the preclinical AD stage. In this study we investigated whether a novel computerized task examining contextual memory can help identify those at increased risk for dementia.
Participants and Methods:
A group of 31 non-carriers (mean age=38.97±6.11; mean education=11.45±4.34) and 15 cognitively unimpaired PSEN1 E280A mutation carriers from the Colombia-Boston (COLBOS) Biomarker Study (mean age=35.67±5.50; mean education=10.60±3.83) performed the “MAPP Room Memory Task” on a computer. As part of this task, participants are asked to remember ten rooms and the specific locations of a few objects for later recall. During the immediate recall phase, participants are asked to recognize the objects presented in each room (Immediate Object Recognition) and their locations (Immediate Object Placement). During the subsequent delay phase of the task, participants are asked to select the correct room in which an object was first presented (Delayed Room Recognition) and place the objects previously seen in each room (Delayed Object Placement). We conducted Mann–Whitney U tests to analyze differences between groups and Spearman rho correlations to examine associations among Room Memory Task performance, age, education, and Mini Mental State Examination (MMSE) scores.
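As a sketch of the statistical tests named above, the following SciPy snippet runs a Mann–Whitney U comparison and a Spearman correlation; the score vectors are hypothetical placeholders, not study data.

```python
from scipy.stats import mannwhitneyu, spearmanr

carriers = [0.9, 0.7, 1.0, 0.8]        # hypothetical Delayed Room Recognition scores
non_carriers = [1.0, 1.0, 0.9, 1.0]

u, p = mannwhitneyu(carriers, non_carriers, alternative="two-sided")
print(f"Mann-Whitney U = {u:.1f}, p = {p:.3f}")

education = [8, 12, 16, 11]            # hypothetical years of education (carriers)
rho, p = spearmanr(education, carriers)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```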
Results:
There were no differences in age or education between carriers and non-carriers (p>0.05 for both). Carriers had worse Delayed Room Recognition than non-carriers (carriers mean score=0.893±0.18, non-carriers mean score=0.987±0.05; U=168.0, p=0.02), while there were no differences in the other task conditions (all p>0.05). In carriers, education was positively associated with Immediate Object Placement (rs=0.61, p=0.02), Delayed Object Placement (rs=0.76, p=0.001), and Delayed Room Recognition (rs=0.68, p=0.006). There were no significant associations between Room Memory Task conditions and age or MMSE scores in carriers. Further, no significant associations were observed between Room Memory Task performance and age, education, or MMSE scores in non-carriers.
Conclusions:
Our preliminary findings show that the MAPP Room Memory Task, in particular the Delayed Room Recognition condition, may be helpful to discriminate those at increased risk of dementia. Future studies with larger samples using the Room Memory Task and AD-related biomarkers are needed to examine whether this task can be sensitive to early preclinical changes associated with AD and can potentially help track disease progression in those at risk.
Cholestasis characterised by conjugated hyperbilirubinemia is a marker of hepatobiliary dysfunction following neonatal cardiac surgery. We aimed to characterise the incidence of conjugated hyperbilirubinemia following neonatal heart surgery and examine the effect of conjugated hyperbilirubinemia on post-operative morbidity and mortality.
Methods:
This was a retrospective study of all neonates who underwent surgery for congenital heart disease (CHD) at our institution between 1/1/2010 and 12/31/2020. Patient- and surgery-specific data were abstracted from local registry data and review of the medical record. Conjugated hyperbilirubinemia was defined as perioperative maximum conjugated bilirubin level > 1 mg/dL. The primary outcome was in-hospital mortality. Survival analysis was conducted using the Kaplan–Meier survival function.
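For illustration, a minimal Kaplan–Meier sketch in Python using the lifelines package is shown below; the durations and event indicators are hypothetical, and the abstract does not specify the original analysis tooling.

```python
from lifelines import KaplanMeierFitter

days_followed = [30, 45, 12, 200, 365, 90]   # hypothetical days of follow-up from surgery
died = [0, 1, 1, 0, 0, 1]                    # 1 = in-hospital death observed

kmf = KaplanMeierFitter()
kmf.fit(days_followed, event_observed=died, label="conjugated hyperbilirubinemia")
print(kmf.survival_function_)                # estimated S(t) at each event time
print(kmf.median_survival_time_)
```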
Results:
Conjugated hyperbilirubinemia occurred in 8.5% of patients during the study period. Neonates with conjugated hyperbilirubinemia were more likely to be of younger gestational age, lower birth weight, and non-Caucasian race (all p < 0.001). Patients with conjugated hyperbilirubinemia were more likely to have chromosomal and non-cardiac anomalies and to require extracorporeal membrane oxygenation (ECMO) pre-operatively. In-hospital mortality among patients with conjugated hyperbilirubinemia was increased compared to those without (odds ratio 5.4). Post-operative complications including mechanical circulatory support, reoperation, prolonged ventilator dependence, and multi-system organ failure were more common with conjugated hyperbilirubinemia (all p < 0.04). Patients with higher levels of conjugated bilirubin had worse intermediate-term survival, with patients in the highest conjugated bilirubin group (>10 mg/dL) having a 1-year survival of only 6%.
Conclusions:
Conjugated hyperbilirubinemia is associated with post-operative complications and worse survival following neonatal heart surgery. Cholestasis is more common in patients with chromosomal abnormalities and non-cardiac anomalies, but the underlying mechanisms have not been delineated.
For over fifty years, minorities in Burma have faced severe persecution and violence, forcing them to flee their homeland. In the past ten years there has been an influx of refugees resettled in Denver, Colorado. Refugees often struggle to navigate the complexities of the American health care system and to adapt to life in a foreign culture. The development of programs and partnerships to assist refugees in their pursuit of health and integration is essential to building stronger communities.
Objectives
This community-based participatory research (CBPR) project was developed in collaboration with the refugee community from Burma living in the Denver area. Through regular meetings, a group of motivated teenagers and young adults from this community formed our Youth Advisory Board (YAB), which identified alcohol use and misuse as a health concern within the community. The project then aimed to gather data from community members that could be leveraged to create, implement, and evaluate a culturally competent intervention to effectively address risky alcohol use in this community.
Methods
Data collection involved formal one-on-one, semi-structured, audio-recorded interviews with community members. Participants were recruited voluntarily at health information nights held by the student researchers at the participants' local apartment complex. The interviews were conducted by one medical student researcher with one translator present and were transcribed afterward. The interview data were analyzed using immersion–crystallization methodology.
Results
Initial results from the community meetings with the YAB, local organizations, formative community surveys, and key informant interviews highlighted the vulnerability of the refugee population, the scarcity of culturally appropriate resources for alcohol abuse, and the urgency of addressing problematic alcohol use. Analysis of the ten audio-recorded interviews revealed several themes, including negative consequences of alcohol use (particularly on familial relationships, employment, and financial resources) and a perceived personal responsibility for managing one's own alcohol consumption.
Conclusions
This project corroborates current literature regarding the scope and breadth of hazardous alcohol use within the community of refugees from Burma. Our data have expanded our understanding of community members' values, including the influence of religion and family on behavior, and identified harm to employment as the most significant negative consequence. These findings need to be shared with the community to move forward in mapping the most effective and appropriate interventions.
Patients with unbalanced common atrioventricular canal can be difficult to manage. Surgical planning often depends on pre-operative echocardiographic measurements. We aimed to determine the added utility of cardiac MRI in predicting successful biventricular repair in common atrioventricular canal.
Methods:
We conducted a retrospective cohort study of children with common atrioventricular canal who underwent MRI prior to repair. Associations between MRI and echocardiographic measures and surgical outcome were tested using logistic regression, and models were compared using area under the receiver operator characteristic curve.
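A minimal scikit-learn sketch of the logistic regression and AUC evaluation described above follows; the feature values and outcomes are invented stand-ins, and a real comparison would use cross-validation rather than in-sample scoring.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hypothetical MRI predictors: EDVI (LV EDV / total EDV) and LV-RV angle (degrees)
X_mri = np.array([[0.15, 60], [0.30, 80], [0.22, 75], [0.12, 55]])
y = np.array([0, 1, 1, 0])               # 1 = successful biventricular repair

model = LogisticRegression().fit(X_mri, y)
auc = roc_auc_score(y, model.predict_proba(X_mri)[:, 1])
print(f"MRI model AUC = {auc:.2f}")      # in-sample; cross-validate in practice
```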
Results:
We included 28 patients (median age at MRI: 5.2 months). The optimal MRI model included the novel end-diastolic volume index (using the ratio of left ventricular end-diastolic volume to total end-diastolic volume) and the left ventricle–right ventricle angle in diastole (area under the curve 0.83, p = 0.041). End-diastolic volume index ≤ 0.18 and left ventricle–right ventricle angle in diastole ≤ 72° yield a sensitivity of 83% and specificity of 81% for successful biventricular repair. The optimal multimodality model included the end-diastolic volume index and the echocardiographic atrioventricular valve index with an area under the curve of 0.87 (p = 0.026).
Conclusions:
Cardiac MRI can predict successful biventricular repair in patients with unbalanced common atrioventricular canal utilising the end-diastolic volume index alone or in combination with the MRI left ventricle–right ventricle angle in diastole or the echocardiographic atrioventricular valve index. A prospective cardiac MRI study is warranted to better define the multimodality characteristics predictive of successful biventricular surgery.
The clinical utility of intravenous (IV) anesthetics was first demonstrated in 1656 by Sir Christopher Wren, an architect, physicist, and astronomer at the University of Oxford, who used a goose quill to inject opium into a dog to produce sleep [1]. In 1909, Ludwig Burkhardt became the first surgeon to deliberately use IV ether, in a 5% solution, to sedate patients for head and neck surgery, finding that a higher concentration caused thrombophlebitis and hemolysis, whereas a lower concentration proved too weak a sedative. The first barbiturate, hexobarbital, was introduced in 1932 and had been used in over 10 million cases by 1944. In 1989, the first propofol lipid emulsion formulation was launched in the United States, marking the beginning of the modern age of IV sedation pharmacology [2].
OBJECTIVES/GOALS: Greater blood pressure (BP) reactivity and socioeconomic deprivation (e.g., area deprivation index; ADI) are associated with poor vascular health [1-3]. However, it is unclear if ADI is associated with BP reactivity. Thus, we sought to examine whether ADI is associated with BP reactivity in young adults. METHODS/STUDY POPULATION: Participants completed questionnaires used to derive lifetime ADI, averaged across early childhood, mid-childhood, and adolescence. Participants completed a handgrip (HG) exercise protocol including 10 minutes of rest, 2 minutes of static HG at 40% of their maximal voluntary contraction, 3 minutes of post-exercise ischemia (PEI), and 2 minutes of recovery (REC). Beat-to-beat BP (photoplethysmography) and heart rate (HR; electrocardiogram) were continuously assessed. We used the Shapiro-Wilk test to assess data for normality. We examined associations between ADI, BP reactivity, and HR using unadjusted and body mass index (BMI)-, sex-, and race-adjusted Pearson's correlations (alpha set a priori to 0.05). RESULTS/ANTICIPATED RESULTS: This study included 53 participants (27 males/26 females; 21 ± 1 years; 24 Black/29 White; BP 107 ± 9/64 ± 9 mmHg). There were racial differences (Black compared to White adults) for several BP reactivity metrics (e.g., PEI minute 3 diastolic BP: 96 ± 15 vs. 84 ± 19 mmHg, p=0.014) and for lifetime ADI (p < 0.050). DISCUSSION/SIGNIFICANCE: Our data suggest racial differences exist in socioeconomic deprivation in a modestly sized young adult sample living in the southeast. While additional data are needed for other stressors, socioeconomic deprivation was not independently associated with BP or HR reactivity during acute exercise.
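As a sketch of the covariate-adjusted correlations described above, the function below computes a partial Pearson correlation by residualizing both variables on the covariates; all data and variable names are hypothetical.

```python
import numpy as np

def partial_corr(x, y, covars):
    """Pearson correlation of x and y after removing covariates by OLS."""
    C = np.column_stack([np.ones(len(x)), covars])
    rx = x - C @ np.linalg.lstsq(C, x, rcond=None)[0]   # residualize x
    ry = y - C @ np.linalg.lstsq(C, y, rcond=None)[0]   # residualize y
    return np.corrcoef(rx, ry)[0, 1]

adi = np.array([3.0, 7.0, 5.0, 9.0, 2.0])               # hypothetical lifetime ADI
dbp_react = np.array([10.0, 18.0, 12.0, 20.0, 9.0])     # hypothetical BP reactivity
bmi = np.array([22.0, 27.0, 24.0, 30.0, 21.0])          # add sex/race columns similarly
print(partial_corr(adi, dbp_react, bmi))
```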
This is a multi-wavelength study of the G45.804-0.355 massive star-forming region (SFR) and its environs. Using MeerKAT with an angular resolution (θ) of 8″ at 1.28 GHz, we identify, for the first time, a faint radio continuum emission core in G45.804-0.355. At 1.3 mm, ALMA observations (θ ∼ 0.7″) resolved the core into multiple dust continuum condensations, including MM1, which was found to be the primary massive dense dust core in the region (mass Mc ∼ 54.3 M⊙). The dust continuum shows arm-like extended emission within which the other dense cores are situated. The velocity gradient of the MM1 core indicates that the source is associated with rotating gas motion. The red- and blue-shifted lobes overlap at the position of MM1. The compact morphology of the 4.5 μm IR emission, the presence of spiral arms, and the overlap of the red- and blue-shifted lobes suggest a face-on geometry for G45.804-0.355.
Despite a wide range of proposed risk factors and theoretical models, prediction of eating disorder (ED) onset remains poor. This study undertook the first comparison of two machine learning (ML) approaches [penalised logistic regression (LASSO) and prediction rule ensembles (PREs)] to conventional logistic regression (LR) models to enhance prediction of ED onset and differential ED diagnoses from a range of putative risk factors.
Method
Data were part of a European Project and comprised 1402 participants: 642 ED patients [52% with anorexia nervosa (AN) and 40% with bulimia nervosa (BN)] and 760 controls. The Cross-Cultural Risk Factor Questionnaire, which retrospectively assesses a range of sociocultural and psychological ED risk factors occurring before the age of 12 years (46 predictors in total), was used.
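To illustrate the contrast between conventional LR and the LASSO approach, here is a brief scikit-learn sketch scored by cross-validated AUC; the data are simulated stand-ins for the 46 risk-factor predictors, not the study data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1402, 46))        # simulated: 46 putative risk factors
y = rng.integers(0, 2, size=1402)      # simulated: 1 = ED case, 0 = control

lr = LogisticRegression(max_iter=1000)                               # conventional LR
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)  # LASSO

for name, model in [("LR", lr), ("LASSO", lasso)]:
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean cross-validated AUC = {auc:.2f}")
```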
Results
All three statistical approaches had satisfactory model accuracy, with an average area under the curve (AUC) of 86% for predicting ED onset and 70% for predicting AN v. BN. Predictive performance was greatest for the two regression methods (LR and LASSO), although the PRE technique relied on fewer predictors with comparable accuracy. The individual risk factors differed depending on the outcome classification (EDs v. non-EDs and AN v. BN).
Conclusions
Even though the conventional LR performed comparably to the ML approaches in terms of predictive accuracy, the ML methods produced more parsimonious predictive models. ML approaches offer a viable way to modify screening practices for ED risk that balance accuracy against participant burden.
Vision and hearing impairments are highly prevalent in adults 65 years of age and older. There is a need to understand their association with multiple health-related outcomes. We analyzed data from the Resident Assessment Instrument for Home Care (RAI-HC). Home care clients were followed for up to 5 years and categorized into seven unique cohorts based on whether or not they developed new vision and/or hearing impairments. An absolute standardized difference (stdiff) of at least 0.2 was considered statistically meaningful. Most clients (at least 60%) were female, and 34.9 per cent developed a new sensory impairment. Those with a new concurrent vision and hearing impairment were more likely than those with no sensory impairments to experience a deterioration in receptive communication (stdiff = 0.68) and in cognitive performance (stdiff = 0.49). After multivariate adjustment, they had a twofold increased odds (adjusted odds ratio [OR] = 2.1; 95% confidence interval [CI]: 1.87, 2.35) of deterioration in cognitive performance. Changes in sensory functioning are common and have important effects on multiple health-related outcomes.
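As a sketch of the absolute standardized difference (stdiff) criterion used above, the following function computes the mean difference scaled by the pooled standard deviation; the example values are hypothetical.

```python
import numpy as np

def standardized_difference(a, b):
    """|mean difference| / pooled standard deviation of two groups."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
    return abs(a.mean() - b.mean()) / pooled_sd

dual_loss = [3.1, 2.8, 3.5, 3.0]   # hypothetical change scores, dual sensory loss
no_loss = [2.2, 2.0, 2.6, 2.4]     # hypothetical change scores, no sensory loss
print(standardized_difference(dual_loss, no_loss))  # meaningful if >= 0.2
```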
Studies have suggested that stress predicts both body dissatisfaction (BD) and disordered eating (DE) patterns. However, the mechanisms of this process are not entirely clear and could be elucidated through further exploration in daily life.
Objectives
The purpose of this study was to 1) explore the concurrent and lagged relationship between stress and BD in the daily life of individuals with differing levels of trait eating pathology (EP) and 2) to investigate whether maladaptive coping moderated these relationships.
Methods
107 female participants (mean age = 26.92) completed an online survey about stress, coping strategies, and trait EP. Participants then used a smartphone app to report on state stress, BD, and DE six times a day for seven days.
Results
Individuals with elevated trait EP experienced a significantly higher frequency of stress events (b = 0.04). Participants' use of maladaptive coping significantly increased state stress (b = 0.41), but this effect was not moderated by EP. Participants' state stress and BD measured at the same time point (concurrent assessment) were significantly related (b = 0.13). Neither stress nor BD at the previous time point significantly predicted changes in the other (lagged assessment; b = 0.02 and b = -0.09, respectively). The aforementioned state-based associations were not moderated by trait EP.
Conclusions
Women with more severe EP were found to experience stress more frequently. Maladaptive coping strategies were related to stress, but not moderated by EP. The association between stress and BD from concurrent but not lagged assessment highlights the importance of assessing and targeting momentary stress levels.
Objectification theory argues that self-objectification confers risk for disordered eating (DE) both directly, and indirectly through a cascade of negative psychological consequences (e.g. low mood and self-conscious body monitoring). Robust cross-sectional evidence supports these relationships. However, these cross-sectional studies do not provide evidence for the complex intraindividual psychological processes outlined in objectification theory which purportedly contribute to DE.
Objectives
Using an ecological momentary assessment design, the current study investigated the direct within-person effect of state self-objectification on DE and examined the indirect within-person effects of negative mood and body comparisons on the relationship between state self-objectification and DE.
Methods
Two hundred female participants (M = 20.43 years, SD = 4.60) downloaded a smartphone app which assessed momentary experiences of self-objectification, mood, body comparisons, and DE six times per day at random intervals for seven days.
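To make the mediation logic concrete before the results that follow, here is a simplified product-of-coefficients sketch with a bootstrap confidence interval; the study itself used multilevel within-person models, and the simulated data below are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
self_obj = rng.normal(size=n)                       # predictor X (simulated)
comparisons = 0.4 * self_obj + rng.normal(size=n)   # mediator M (simulated)
de = 0.02 * self_obj + rng.normal(size=n)           # outcome Y (simulated, null indirect path)

def indirect_effect(x, m, y):
    a = np.polyfit(x, m, 1)[0]                      # X -> M path
    design = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(design, y, rcond=None)[0][2]  # M -> Y path, X held constant
    return a * b

boots = []
for _ in range(2000):                               # bootstrap resampling
    idx = rng.integers(0, n, n)
    boots.append(indirect_effect(self_obj[idx], comparisons[idx], de[idx]))
print(np.percentile(boots, [2.5, 97.5]))            # 95% CI for the indirect effect
```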
Results
Analyses indicated that self-objectification significantly predicted DE behaviours [95% CI 0.01, 0.03] and body comparisons [95% CI 0.32, 0.41]. However, the indirect effect of body comparisons on the relationship between state self-objectification and DE was not significant [95% CI -0.01, 0.00]. In the second mediation model, self-objectification significantly predicted DE behaviours [95% CI 0.01, 0.03] but did not significantly predict mood [95% CI -0.06, 0.03]. Similarly, the indirect effect of mood on the relationship between state self-objectification and DE was not significant [95% CI -0.00, 0.00].
Conclusions
These results enhance our understanding of objectification theory and suggest that self-objectification confers risk to DE directly. However, our findings do not support the indirect effect of self-objectification on DE through low mood or body comparisons.
The DSM-5 introduced severity indices for eating disorders for the first time.
Objectives
We conducted a systematic review and synthesized the frequency of each DSM-5 severity category (i.e., mild, moderate, severe, and extreme) for anorexia nervosa (AN), bulimia nervosa (BN), and binge eating disorder (BED), and evaluated studies that assess the clinical utility of these severity specifiers for all eating disorder (ED) subtypes.
Methods
Five databases (EMBASE, MEDLINE, PsycARTICLES, PsycINFO, and ProQuest) were searched to identify both academic and grey literature published from 2013 until July 8, 2020. Twenty-five studies were retained for the systematic review based on the inclusion and exclusion criteria, and up to six studies qualified for meta-analysis.
Results
We found limited support for the current DSM-5 severity ratings across all ED indices, as the majority of ED severity groups were not significantly distinguishable in overall ED psychopathology (mean effect sizes ranged from 0.02 to 0.5). The value of the DSM-5 severity ratings was further undermined in that 56.91% to 80.52% of individuals with AN, BN, and BED were categorized into the mild and moderate groups. However, there was significant heterogeneity between the studies (p < .001), some of which was explained by differences in study settings and in the measurement of eating disorder psychopathology.
Conclusions
Overall, the current study provided little support for the DSM-5 severity ratings for EDs; further exploration of alternative severity classification approaches is therefore needed.
Veganism has increased in popularity in the past decade and, despite being a characteristic protected by law, is often viewed negatively by the general population. Little is known about the attitudes of healthcare professionals, despite their potential influence on practice and eating disorder patient care. This is one of the first studies to investigate attitudes toward veganism among specialist eating disorder, general mental health, and other professionals.
Results
A one-way ANOVA indicated that all professional groups held positive views toward veganism. General mental health professionals held significantly more positive attitudes toward veganism than specialist eating disorder and other professionals.
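For reference, a one-way ANOVA of this kind can be sketched in Python with SciPy as follows; the attitude scores are hypothetical placeholders, not study data.

```python
from scipy.stats import f_oneway

ed_specialists = [4.1, 3.8, 4.0, 3.9]   # hypothetical attitude scores per group
mental_health = [4.6, 4.4, 4.7, 4.5]
other = [4.0, 3.9, 4.2, 3.8]

f, p = f_oneway(ed_specialists, mental_health, other)
print(f"F = {f:.2f}, p = {p:.3f}")      # follow up with post-hoc pairwise tests
```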
Clinical implications
As one of the first studies to suggest that eating disorder professionals are not biased against veganism, this work has important clinical practice implications, particularly when exploring motivations for adopting a vegan diet (health, weight loss, environmental, or animal welfare concerns) in patients with eating disorders. Implications for further research are provided.
ABSTRACT IMPACT: This work will standardize necessary image pre-processing for diagnostic and prognostic clinical workflows dependent on quantitative analysis of conventional magnetic resonance imaging. OBJECTIVES/GOALS: Conventional magnetic resonance imaging (MRI) poses challenges for quantitative analysis due to a lack of uniform inter-scanner voxel intensity values. Head and neck cancer (HNC) applications in particular have not been well investigated. This project aims to systematically evaluate voxel intensity standardization (VIS) methods for HNC MRI. METHODS/STUDY POPULATION: We utilize two separate cohorts of HNC patients, in each of which T2-weighted (T2-w) MRI sequences were acquired from five patients before beginning radiotherapy. The first cohort corresponds to patients with images taken at various institutions with a variety of non-uniform acquisition scanners and parameters. The second cohort corresponds to patients from a prospective clinical trial with uniformity in both scanner and acquisition parameters. Regions of interest from a variety of healthy tissues assumed to have minimal interpatient variation were manually contoured for each image and used to compare differences between a variety of VIS methods for each cohort. Towards this end, we implement a new metric of cohort intensity distributional overlap to compare region-of-interest similarity in a given cohort. RESULTS/ANTICIPATED RESULTS: Using a simple and interpretable metric, we have systematically investigated the effects of various commonly implementable VIS methods on T2-w sequences for two independent cohorts of HNC patients based on region-of-interest intensity similarity. We demonstrate that VIS has a substantial effect on T2-w images where non-uniform acquisition parameters and scanners are utilized. Conversely, it has a modest to minimal impact on T2-w images generated from the same scanner with the same acquisition parameters. Moreover, with a few notable exceptions, there does not seem to be a clear advantage or disadvantage to using one VIS method over another for T2-w images with non-uniform acquisition parameters. DISCUSSION/SIGNIFICANCE OF FINDINGS: Our results inform which VIS methods should be favored in HNC MRI and may indicate VIS is not a critical factor to consider in circumstances where similar acquisition parameters can be utilized. Moreover, our results can help guide downstream quantitative imaging tasks that may one day be implemented in clinical workflows.
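As one concrete example of a commonly implementable VIS method, the sketch below applies z-score normalization to a T2-w volume relative to a reference region of interest; the abstract does not specify which VIS methods were compared, and the arrays here are hypothetical.

```python
import numpy as np

def zscore_standardize(image, roi_mask=None):
    """Rescale voxel intensities to zero mean / unit SD, computed over the
    whole image or over a reference region of interest."""
    ref = image[roi_mask] if roi_mask is not None else image
    return (image - ref.mean()) / ref.std()

t2w = np.random.rand(64, 64, 32) * 1000        # stand-in T2-w volume
healthy_roi = t2w > 500                        # stand-in healthy-tissue mask
standardized = zscore_standardize(t2w, healthy_roi)
print(standardized[healthy_roi].mean(), standardized[healthy_roi].std())  # ~0, ~1
```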
The purpose of this study was to describe the prevalence of hearing loss (HL), vision loss (VL), and dual sensory loss (DSL) in Canadians 45–85 years of age. Audiometry and visual acuity were measured, and various levels of impairment severity were described. Results were extrapolated to the 2016 Canadian population. In 2016, 1,500,000 Canadian males 45–85 years of age had at least mild HL, 1,800,000 had at least mild VL, and 570,000 had DSL. Among females, 1,200,000 had at least mild HL, 2,200,000 had at least mild VL, and 450,000 had DSL. Among Canadians 45–85 years of age, mild, moderate, and severe HL was prevalent among 13.4 per cent, 3.7 per cent, and 0.4 per cent of males, and among 11.3 per cent, 2.3 per cent, and 0.2 per cent of females, respectively. Mild VL and moderate or severe VL were prevalent among 19.8 per cent and 2.4 per cent of males, and among 23.9 per cent and 2.6 per cent of females, respectively. At least mild DSL was prevalent among 6.4 per cent of males and 6.1 per cent of females.