Background: Nipocalimab is a human IgG1 monoclonal antibody targeting FcRn that selectively reduces IgG levels without impacting antigen presentation or T- and B-cell functions. This study describes the effect of nipocalimab on vaccine response. Methods: This open-label, parallel, interventional study randomized participants 1:1 to receive intravenous nipocalimab 30 mg/kg at Week 0 and 15 mg/kg at Weeks 2 and 4 (active) or no drug (control). On Day 3, participants received Tdap and PPSV®23 vaccinations and were followed through Week 16. Results: Twenty-nine participants completed the study and are included (active, n=15; control, n=14). The proportion of participants with a positive anti-tetanus IgG response was comparable between groups at Weeks 2 and 16 but lower in the nipocalimab group at Week 4 (3/15 [20%] vs control 7/14 [50%]; P=0.089). All participants maintained anti-tetanus IgG above the protective threshold (0.16 IU/mL) through Week 16. While anti-pneumococcal-capsular-polysaccharide (PCP) IgG levels were lower during nipocalimab treatment, the percent increase from baseline at Weeks 2 and 16 was comparable between groups. Post-vaccination, anti-PCP IgG remained above 50 mg/L and showed a 2-fold increase from baseline throughout the study in both groups. Nipocalimab co-administration with vaccines was safe and well tolerated. Conclusions: These findings suggest that nipocalimab does not impair the development of an adequate IgG response to T-cell-dependent and T-cell-independent vaccines, and that nipocalimab-treated patients can follow recommended vaccination schedules.
New South Wales (NSW) Health is committed to enhancing child health and development during the first 2000 days (conception to 5 years)(1). However, current child health behaviours in Australia indicate the need for further improvement. For example, discretionary foods (contributing high amounts of saturated fat, energy, added sodium and sugar), including sugar-sweetened beverages (SSB), account for approximately 30% of total energy intake in 2–3-year-olds(2). There remains a need to provide all parents raising children with direct and sustained support from birth to maximise health behaviours during this important life stage. Healthy Beginnings for HNEKids (HB4HNEKids) is an innovative text messaging program designed to be integrated into the usual care provided by Child and Family Health Nursing (CFHN) services. The messages were co-designed with key stakeholders to provide age- and stage-relevant preventive health information to parents/carers during the first 2000 days. HB4HNEKids has been piloted within five diverse CFHN services in the Hunter New England (HNE) local health district of NSW, reaching over 6000 families since its launch. However, the efficacy of the program on child health behaviours has not yet been explored. The aim of this study was to explore whether families who received the HB4HNEKids program report a reduced frequency of child discretionary food intake and a lower prevalence of SSB exposure, compared with families who did not receive the program. A cross-sectional survey of mothers 12–14 months post-partum was conducted between August 2023 and July 2024, including participants who received HB4HNEKids and a concurrent non-randomised comparison group, located in HNE. Mothers were asked to report the frequency of child discretionary food intake per week, and whether their child had ever received SSB (including sweetened water, cordial, fruit drink, and soft drinks).
We conducted linear regression and logistic regression analyses to explore differences between intervention and comparison participants. A total of 283 participants completed the survey, including 104 (37%) who had received the HB4HNEKids program. In infants aged 12–14 months, the frequency of discretionary food intake was approximately 1 serve per week and did not differ by whether the family had received the HB4HNEKids program. Despite a difference of nearly 7 percentage points in reported SSB exposure between groups (HB4HNEKids: 19.42% vs comparison: 26.26%), this difference was not statistically significant (OR: 0.68; 95% CI: 0.37, 1.23; p = 0.2). Australian infant feeding guidelines recommend that consumption of nutrient-poor discretionary foods and sugar-sweetened beverages be avoided or limited(3). The HB4HNEKids program shows some promise for improving infant feeding behaviours; however, a larger effectiveness trial is required to ensure the evaluation is adequately powered.
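The unadjusted odds ratio reported above can be reproduced from the two group prevalences alone; a minimal sketch (the proportions are taken from the abstract, while the confidence interval would require the underlying counts, which are not reported here):

```python
# Reproduce the unadjusted odds ratio for SSB exposure from the
# reported group prevalences (HB4HNEKids 19.42% vs comparison 26.26%).
def odds(p):
    """Convert a proportion to odds."""
    return p / (1 - p)

p_intervention = 0.1942  # SSB exposure, HB4HNEKids group
p_comparison = 0.2626    # SSB exposure, comparison group

odds_ratio = odds(p_intervention) / odds(p_comparison)
print(round(odds_ratio, 2))  # → 0.68, matching the reported OR
```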
Bell’s palsy is an acute facial palsy due to inflammation of the facial nerve, typically related to infection. Rates have not been noted to differ by ethnicity. We studied the lifetime prevalence of Bell’s palsy among people with type 2 diabetes mellitus aged 7 years and older, comparing First Nations people with all other Manitobans, in 2013–2014 and 2016–2017. We found a crude lifetime prevalence of 9.9% [95% CI 9.4–10.4%] in the First Nations population versus 3.9% [95% CI 3.8–4.0%] in all other Manitobans. It is unknown whether there were differences in glycemic control. The increased prevalence was found in all five provincial health regions. This study indicates that ethnicity may be an important risk factor for Bell’s palsy.
Geotechnical drilling for a tunnel between Port Moody and Burnaby, BC, Canada, uncovered a buried fjord. Its sedimentary fill has a thickness of at least 130 m and extends more than 37 m below present mean sea level. Recovered sediments record the cyclical growth and decay of successive Cordilleran ice sheets. The oldest sediments comprise 58 m of almost stoneless silt conformably overlying ice-proximal sediments and till, which in turn overlie bedrock. These sediments may predate Marine Isotope Stage (MIS) 4. Glacial sediments assigned to MIS 4 overlie this basal succession and, in turn, are overlain by MIS 3 interstadial sediments and sediments from two MIS 2 glacial advances. Indicators of relative sea-level elevations that bracket the glacial deposits of MIS 4 and MIS 2 indicate the cyclic existence of moat-like isostatic depressions in front of expanding ice sheets. Relative to present sea level, these depressions were at least 160 m deep during the onsets of MIS 4 and MIS 2. Assuming a maximum eustatic drawdown of 120 m during MIS 2, isostatic depression may have exceeded 200 m during retreat of glacial ice from the Evergreen tunnel area. This is consistent with region-specific low mantle viscosity and rapid Cordilleran Ice Sheet buildup and wasting.
Deferration by reduction of free Fe2O3 with Na2S2O4 in the presence of Na citrate and NaHCO3 caused a change in valence state of 10 to 35 per cent of the total structural iron in micaceous vermiculites, soils, nontronite, and muscovite. An increase in Fe2+ on deferration was accompanied by an equivalent decrease in Fe3+. Subsequent treatment with H2O2 reoxidized the structural Fe2+ previously formed.
Sesquioxide coatings on micaceous vermiculites were examined electron microscopically. These coatings were composed predominantly of Fe2O3 with approximately 10 per cent by weight of Al2O3 and small amounts of SiO2, as determined by chemical analysis of the deferration extracts.
The cation exchange capacity (CEC) increased 10–60 per cent as a result of deferration of micaceous vermiculites and soils. Treatment of the deferrated sample with H2O2 restored the Fe3+ content to approximately the original value but the CEC was not affected. Consequently, the increase in CEC on deferration was attributed to the removal of the positively charged sesquioxide coating. The reversible change in valence of structural iron without an equivalent change in CEC was attributed to deprotonation-protonation of the structure (OH− ⇄ O2−) simultaneous with the oxidation-reduction of iron (Fe2+ ⇄ Fe3+) in the phyllosilicate layer.
Although the link between alcohol involvement and behavioral phenotypes (e.g. impulsivity, negative affect, executive function [EF]) is well-established, the directionality of these associations, specificity to stages of alcohol involvement, and extent of shared genetic liability remain unclear. We estimate longitudinal associations between transitions among alcohol milestones, behavioral phenotypes, and indices of genetic risk.
Methods
Data came from the Collaborative Study on the Genetics of Alcoholism (n = 3681; ages 11–36). Alcohol transitions (first: drink, intoxication, alcohol use disorder [AUD] symptom, AUD diagnosis), internalizing, and externalizing phenotypes came from the Semi-Structured Assessment for the Genetics of Alcoholism. EF was measured with the Tower of London and Visual Span Tasks. Polygenic scores (PGS) were computed for alcohol-related and behavioral phenotypes. Cox models estimated associations among PGS, behavior, and alcohol milestones.
Results
Externalizing phenotypes (e.g. conduct disorder symptoms) were associated with future initiation and drinking problems (hazard ratio (HR) ⩾ 1.16). Internalizing phenotypes (e.g. social anxiety) were associated with increased hazards for progression from first drink to severe AUD (HR ⩾ 1.55). Initiation and AUD were associated with increased hazards for later depressive symptoms and suicidal ideation (HR ⩾ 1.38), and initiation was associated with increased hazards for future conduct symptoms (HR = 1.60). EF was not associated with alcohol transitions. The drinks-per-week PGS was associated with increased hazards for alcohol transitions (HR ⩾ 1.06), and the problematic alcohol use PGS with increased hazards for suicidal ideation (HR = 1.20).
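Under the proportional hazards assumption of the Cox models used here, a hazard ratio acts as a constant multiplier on the baseline hazard, which implies S_exposed(t) = S_baseline(t)^HR. A small illustrative sketch (the survival probability of 0.90 is hypothetical; the HR of 1.60 is the estimate reported above for conduct symptoms following initiation):

```python
# Illustrate what a hazard ratio means under proportional hazards:
# S_exposed(t) = S_baseline(t) ** HR at every time t.
hr = 1.60          # reported HR: conduct symptoms after alcohol initiation
s_baseline = 0.90  # hypothetical probability of remaining symptom-free at time t

s_exposed = s_baseline ** hr
print(round(s_exposed, 3))  # → 0.845: a higher hazard lowers survival
```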
Conclusions
Behavioral markers of addiction vulnerability precede and follow alcohol transitions, highlighting dynamic, bidirectional relationships between behavior and emerging addiction.
Clinical outcomes of repetitive transcranial magnetic stimulation (rTMS) for treatment-resistant depression (TRD) vary widely, and no mood rating scale is standard for assessing rTMS outcome. It remains unclear whether rTMS is as efficacious in older adults with late-life depression (LLD) as in younger adults with major depressive disorder (MDD). This study examined the effect of age on outcomes of rTMS treatment of adults with TRD. Self-report and observer mood ratings were measured weekly in 687 subjects aged 16–100 years undergoing rTMS treatment, using the Inventory of Depressive Symptomatology 30-item Self-Report (IDS-SR), Patient Health Questionnaire 9-item (PHQ), Profile of Mood States 30-item, and Hamilton Depression Rating Scale 17-item (HDRS). All rating scales detected significant improvement with treatment; response and remission rates varied by scale but not by age (response/remission ≥ 60 years: 38%–57%/25%–33%; < 60 years: 32%–49%/18%–25%). Proportional hazards models showed that early improvement predicted later improvement across ages, though early improvements in PHQ and HDRS were more predictive of remission in those < 60 years (relative to those ≥ 60), and greater baseline IDS symptom burden was more predictive of non-remission in those ≥ 60 years (relative to those < 60). These results indicate no significant effect of age on treatment outcomes in rTMS for TRD, though rating instruments may differ in their assessment of symptom burden between younger and older adults during treatment.
Evidence from previous research suggests that frame-of-reference (FOR) training is effective at improving assessor ratings in many organizational settings. Yet no research has presented a thorough examination of systematic sources of variance (assessor-related effects, evaluation settings, and measurement design features) that might influence training effectiveness. Using a factorial ANOVA and variance components analyses on a database of four studies of frame-of-reference assessor training, we found that (a) training is most effective at identifying low levels of performance and (b) the setting of the training makes little difference with respect to training effectiveness. We also show evidence of the importance of rater training as a key determinant of the quality of performance ratings in general. Implications for FOR training theory and practice are discussed.
Diets deficient in fibre are reported globally. The associated health risks of insufficient dietary fibre are sufficiently grave to necessitate large-scale interventions to increase population intake levels. The Danish Whole Grain Partnership (DWP) is a public–private enterprise model that successfully augmented whole-grain intake in the Danish population. The potential transferability of the DWP model to Slovenia, Romania and Bosnia-Herzegovina has recently been explored. Here, we outline the feasibility of adopting the approach in the UK. Drawing on the collaborative experience of DWP partners, academics from the Healthy Soil, Healthy Food, Healthy People (H3) project and food industry representatives (Food and Drink Federation), this article examines the transferability of the DWP approach to increase whole grain and/or fibre intake in the UK. Specific consideration is given to the UK’s political, regulatory and socio-economic context. We note key political, regulatory, social and cultural challenges to transferring the success of DWP to the UK, highlighting the particular challenge of increasing fibre consumption among low socio-economic status groups – which were also most resistant to interventions in Denmark. Wholesale transfer of the DWP model to the UK is considered unlikely given the absence of the key ‘success factors’ present in Denmark. However, the DWP provides a template against which a UK-centric approach can be developed. In the absence of a clear regulatory context for whole grain in the UK, fibre should be prioritised and public–private partnerships supported to increase the availability and acceptability of fibre-rich foods.
This study investigated sex differences in Fe status, and associations between Fe status and endurance and musculoskeletal outcomes, in military training. In total, 2277 British Army trainees (581 women) participated. Fe markers and endurance performance (2·4 km run) were measured at the start (week 1) and end (week 13) of training. Whole-body areal bone mineral density (aBMD) and markers of bone metabolism were measured at week 1. Injuries during training were recorded. Training decreased Hb in men and women (mean change –0·1 (95 % CI –0·2, –0·0) and –0·7 (95 % CI –0·9, –0·6) g/dl, both P < 0·001) but more so in women (P < 0·001). Ferritin decreased in men and women (–27 (95 % CI –28, –23) and –5 (95 % CI –8, –1) µg/l, both P ≤ 0·001) but more so in men (P < 0·001). Soluble transferrin receptor increased in men and women (2·9 (95 % CI 2·3, 3·6) and 3·8 (95 % CI 2·7, 4·9) nmol/l, both P < 0·001), with no difference between sexes (P = 0·872). Erythrocyte distribution width increased in men (0·3 (95 % CI 0·2, 0·4) %, P < 0·001) but not in women (0·1 (95 % CI –0·1, 0·2) %, P = 0·956). Mean corpuscular volume decreased in men (–1·5 (95 % CI –1·8, –1·1) fL, P < 0·001) but not in women (0·4 (95 % CI –0·4, 1·3) fL, P = 0·087). Lower ferritin was associated with slower 2·4 km run time (P = 0·018), sustaining a lower limb overuse injury (P = 0·048), lower aBMD (P = 0·021), and higher beta C-telopeptide cross-links of type 1 collagen and procollagen type 1 N-terminal propeptide (both P < 0·001), controlling for sex. Improving Fe stores before training may protect Hb in women, and may improve endurance and protect against injury.
Frontal ablation, the combination of submarine melting and iceberg calving, changes the geometry of a glacier's terminus, influencing glacier dynamics, the fate of upwelling plumes and the distribution of submarine meltwater input into the ocean. Directly observing frontal ablation and terminus morphology below the waterline is difficult, however, limiting our understanding of these coupled ice–ocean processes. To investigate the evolution of a tidewater glacier's submarine terminus, we combine 3-D multibeam point clouds of the subsurface ice face at LeConte Glacier, Alaska, with concurrent observations of environmental conditions during three field campaigns between 2016 and 2018. We observe terminus morphology that was predominantly overcut (52% in August 2016, 63% in May 2017 and 74% in September 2018), accompanied by high multibeam sonar-derived melt rates (4.84 m d−1 in 2016, 1.13 m d−1 in 2017 and 1.85 m d−1 in 2018). We find that periods of high subglacial discharge lead to localized undercutting at discharge outlets, but that adjacent to these outlets the terminus maintains a significantly overcut geometry, with an ice ramp that protruded 75 m into the fjord in 2017 and 125 m in 2018. Our data challenge the assumption that tidewater glacier termini are largely undercut during periods of high submarine melting.
High-quality evidence from prospective longitudinal studies in humans is essential to testing hypotheses related to the developmental origins of health and disease. In this paper, the authors draw upon their own experiences leading birth cohorts with longitudinal follow-up into adulthood to describe specific challenges and lessons learned. Challenges are substantial and grow over time. Long-term funding is essential for study operations and critical to retaining study staff, who develop relationships with participants and hold important institutional knowledge and technical skill sets. To maintain contact, we recommend that cohorts apply multiple strategies for tracking and obtain as much high-quality contact information as possible before the child’s 18th birthday. To maximize engagement, we suggest that cohorts offer flexibility in visit timing, length, location, frequency, and type. Data collection may entail multiple modalities, even at a single collection timepoint, including measures that are self-reported, research-measured, and administrative with a mix of remote and in-person collection. Many topics highly relevant for adolescent and young adult health and well-being are considered to be private in nature, and their assessment requires sensitivity. To motivate ongoing participation, cohorts must work to understand participant barriers and motivators, share scientific findings, and provide appropriate compensation for participation. It is essential for cohorts to strive for broad representation including individuals from higher risk populations, not only among the participants but also the staff. Successful longitudinal follow-up of a study population ultimately requires flexibility, adaptability, appropriate incentives, and opportunities for feedback from participants.
Loneliness, a negative emotion stemming from the perception of unmet social needs, is a major public health concern. Current interventions often target social domains but produce small effects and are not as effective as established emotion regulation (ER)-based interventions for general psychological distress (i.e., depression/anxiety). Given that loneliness and distress are types of negative affect, we aimed to compare them within an ER framework by examining the amount of variance ER strategies accounted for in loneliness versus distress, and comparing the ER strategy profiles characterising them. Participants (N = 582, mean age = 22.31 years, 77.66% female) completed self-report measures of loneliness, distress, and use of 12 cognitive (e.g., cognitive reappraisal) or behavioural (e.g., expressive suppression) ER strategies. Regression analyses revealed that ER explained comparable variance in these constructs. Latent profile analysis identified seven profiles differing in ER patterns, with no distinct loneliness or distress profile identified. Rather, similar patterns of ER characterised these two constructs, involving the greater use of generally maladaptive strategies and the lesser use of generally adaptive strategies. However, loneliness was additionally characterised by less use of strategies involving social connection/expression. Overall, our study supports the utility of ER for understanding loneliness. Established ER-based frameworks/interventions for distress may have transdiagnostic utility in targeting loneliness.
The coronavirus disease 2019 (COVID-19) pandemic has prompted concerns regarding increased suicide rates and exacerbation of underlying mental illness symptoms.
• There is evidence suggesting that neurocognitive changes, as well as the immune response in COVID-19 infection, may increase a patient’s propensity for suicidal ideation.
• Patients diagnosed with COVID-19 may be affected by psychological factors including anxiety, stress related to having this novel virus, as well as depression, post-traumatic stress disorder and sleep disorders, both during and after treatment.
• The combination of psychiatric, neurological, and physical symptoms associated with COVID-19 may elevate suicide risk.
Objectives
We present a case of a female patient with no prior psychiatric history who impulsively attempted suicide after a recent COVID-19 diagnosis and subsequent quarantine. We explore the possible link between increased suicidal ideation and COVID-19 infection.
Methods
A case report.
Results
A link between increased suicidal ideation and COVID-19 infection has not been clearly established, but there have been reports, as in our case, of the possible vulnerability to mental illness and new-onset suicidal ideation that COVID-19 survivors may experience. It may be useful to screen all patients for depressive symptoms after a COVID-19 infection. Early identification and treatment of depression in recovered COVID-19 patients will help to mitigate the psychological impact on COVID-19 survivors and potentially reduce suicide rates.
Conclusions
As COVID-19 infection may trigger new-onset mental illness, exacerbate symptoms of underlying mental illness, and increase suicidal ideation, further research is needed to evaluate the links between COVID-19 infection and depression with suicidal ideation.
Methicillin-resistant Staphylococcus aureus (MRSA) infection is highly unlikely when nasal-swab results are negative. We evaluated the impact of an electronic prompt regarding MRSA nasal screening on the length of vancomycin therapy for respiratory indications.
Design:
Retrospective, single-center cohort study.
Setting:
Tertiary-care academic medical center (Mayo Clinic) in Jacksonville, Florida.
Patients:
Eligible patients received empiric treatment with vancomycin for suspected or confirmed respiratory infections from January through April 2019 (preimplementation cohort) and from October 2019 through January 2020 (postimplementation cohort).
Intervention:
The electronic health system software was modified to provide a best-practice advisory (BPA) prompt to the pharmacist upon order verification of vancomycin for patients with suspected or confirmed respiratory indications. Pharmacists were prompted to order an MRSA nasal swab if one had not already been ordered by the provider.
Methods:
We reviewed patient records to determine the time from vancomycin prescription to de-escalation. The secondary end point was incidence of acute kidney injury.
Results:
The study included 120 patients (preimplementation, n = 61; postimplementation, n = 59). Median time to de-escalation was significantly shorter in the postimplementation cohort: 76 hours (interquartile range [IQR], 52–109) preimplementation versus 42 hours (IQR, 37–61) postimplementation (P = .002). Acute kidney injury occurred in 11 patients (18%) in the preimplementation cohort and in 3 patients (5%) in the postimplementation cohort (P = .01; number needed to treat, 8).
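The reported number needed to treat follows directly from the two acute kidney injury rates; a quick check using the counts from the abstract:

```python
import math

# Acute kidney injury rates from the abstract.
risk_pre = 11 / 61   # preimplementation: 11 of 61 patients (18%)
risk_post = 3 / 59   # postimplementation: 3 of 59 patients (5%)

arr = risk_pre - risk_post  # absolute risk reduction, ≈ 0.13
nnt = math.ceil(1 / arr)    # conventionally rounded up to a whole patient
print(nnt)  # → 8, matching the reported number needed to treat
```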
Conclusions:
Implementation of a BPA notification for MRSA nasal screening helped decrease the time to de-escalation of vancomycin.
Emotion dysregulation is cross-diagnostic and impairing. Most research has focused on dysregulated expressions of negative affect, often measured as irritability, which is associated with multiple forms of psychopathology and predicts negative outcomes. However, the Research Domain Criteria (RDoC) include both negative and positive valence systems. Emerging evidence suggests that dysregulated expressions of positive affect, or excitability, in early childhood predict later psychopathology and impairment above and beyond irritability. Typically, irritability declines from early through middle childhood; however, the developmental trajectory of excitability is unknown. The impact of excitability across childhood on later emotion dysregulation also remains unknown. In a well-characterized, longitudinal sample of 129 children studied from ages 3 to 5.11 years through 14 to 19 years, enriched for early depression and disruptive symptoms, we assessed the trajectories of irritability and excitability using multilevel modeling and examined how components of these trajectories relate to later emotion dysregulation. While irritability declines across childhood, excitability remains remarkably stable both within and across the group. Overall levels of excitability (the excitability intercept) predict later emotion dysregulation as measured by parent and self-report, and predict decreased functional magnetic resonance imaging activity in cognitive emotion regulation regions during an emotion regulation task. Irritability was not related to any dysregulation outcome above and beyond excitability.
The GaLactic and Extragalactic All-sky Murchison Widefield Array (GLEAM) is a radio continuum survey at 76–227 MHz of the entire southern sky (Declination $<\!{+}30^{\circ}$) with an angular resolution of ${\approx}2$ arcmin. In this paper, we combine GLEAM data with optical spectroscopy from the 6dF Galaxy Survey to construct a sample of 1 590 local (median $z \approx 0.064$) radio sources with $S_{200\,\mathrm{MHz}} > 55$ mJy across an area of ${\approx}16\,700\,\mathrm{deg}^{2}$. From the optical spectra, we identify the dominant physical process responsible for the radio emission from each galaxy: 73% are fuelled by an active galactic nucleus (AGN) and 27% by star formation. We present the local radio luminosity function for AGN and star-forming (SF) galaxies at 200 MHz and characterise the typical radio spectra of these two populations between 76 MHz and ${\sim}1$ GHz. For the AGN, the median spectral index between 200 MHz and ${\sim}1$ GHz, $\alpha_{\mathrm{high}}$, is $-0.600 \pm 0.010$ (where $S \propto \nu^{\alpha}$) and the median spectral index within the GLEAM band, $\alpha_{\mathrm{low}}$, is $-0.704 \pm 0.011$. For the SF galaxies, the median value of $\alpha_{\mathrm{high}}$ is $-0.650 \pm 0.010$ and the median value of $\alpha_{\mathrm{low}}$ is $-0.596 \pm 0.015$. Among the AGN population, flat-spectrum sources are more common at lower radio luminosity, suggesting the existence of a significant population of weak radio AGN that remain core-dominated even at low frequencies. However, around 4% of local radio AGN have ultra-steep radio spectra at low frequencies ($\alpha_{\mathrm{low}} < -1.2$). These ultra-steep-spectrum sources span a wide range in radio luminosity, and further work is needed to clarify their nature.
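The spectral indices quoted above use the convention S ∝ ν^α, so α can be recovered from flux densities at two frequencies as α = ln(S₁/S₂)/ln(ν₁/ν₂). A minimal sketch (the 55 mJy value is the survey's selection limit, reused here purely as an illustrative flux density):

```python
import math

def spectral_index(s1_mjy, nu1_mhz, s2_mjy, nu2_mhz):
    """Two-point spectral index alpha, with S proportional to nu**alpha."""
    return math.log(s1_mjy / s2_mjy) / math.log(nu1_mhz / nu2_mhz)

# Illustrative source: 55 mJy at 200 MHz, extrapolated to 1 GHz with the
# median AGN high-frequency index of -0.60.
s_200 = 55.0
s_1000 = s_200 * (1000 / 200) ** -0.60  # ≈ 20.9 mJy at 1 GHz

alpha = spectral_index(s_200, 200, s_1000, 1000)
print(round(alpha, 2))  # → -0.6, recovered by construction
```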
Population reductions in Na intake could prevent hypertension, and current guidelines recommend that clinicians advise patients to reduce intake. This study aimed to estimate the prevalence of taking action and receiving advice from a health professional to reduce Na intake in ten US jurisdictions, including the first-ever data in New York state and Guam.
Design:
Weighted prevalence and 95 % CI were estimated overall and by location, demographic group, health status and receipt of provider advice, using self-reported data from the 2017 Behavioral Risk Factor Surveillance System optional Na module.
Setting:
Seven states, the District of Columbia, Puerto Rico and Guam.
Participants:
Adults aged ≥ 18 years.
Results:
Overall, 53·6 % (95 % CI 52·7, 54·5) of adults reported taking action to reduce Na intake, including 54·8 % (95 % CI 52·8, 56·7) in New York and 61·2 % (95 % CI 57·6, 64·7) in Guam. Prevalence varied by demographic and health characteristic and was higher among adults who reported having hypertension (72·5 %; 95 % CI 71·2, 73·7) v. those who did not report having hypertension (43·9 %; 95 % CI 42·7, 45·0). Among those who reported receiving Na reduction advice from a health professional, 82·6 % (95 % CI 81·3, 83·9) reported action v. 44·4 % (95 % CI 43·4, 45·5) among those who did not receive advice. However, only 24·0 % (95 % CI 23·3, 24·7) of adults reported receiving advice from a health professional to reduce Na intake.
Conclusions:
The majority of adults report taking action to reduce Na intake. Results highlight an opportunity to increase Na reduction advice from health professionals during clinical visits to better align with existing guidelines.
People with CHD are at increased risk for executive functioning deficits. However, meta-analyses of executive function measures in CHD patients compared with healthy controls have not been reported.
Objective:
To examine differences in executive functions in individuals with CHD compared to healthy controls.
Data sources:
We performed a systematic review of publications from 1 January, 1986 to 15 June, 2020 indexed in PubMed, CINAHL, EMBASE, PsycInfo, Web of Science, and the Cochrane Library.
Study selection:
Inclusion criteria were (1) studies containing at least one executive function measure; (2) participants were over the age of three.
Data extraction:
Data extraction and quality assessment were performed independently by two authors. We used a shifting unit-of-analysis approach and pooled data using a random effects model.
Results:
The search yielded 61,217 results. Twenty-eight studies met criteria. A total of 7789 people with CHD were compared with 8187 healthy controls. We found the following standardised mean differences: −0.628 (−0.726, −0.531) for cognitive flexibility and set shifting, −0.469 (−0.606, −0.333) for inhibition, −0.369 (−0.466, −0.273) for working memory, −0.334 (−0.546, −0.121) for planning/problem solving, −0.361 (−0.576, −0.147) for summary measures, and −0.444 (−0.614, −0.274) for reporter-based measures (p < 0.001).
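The pooled standardised mean differences above come from an inverse-variance random-effects model. A minimal sketch of that pooling step using the common DerSimonian–Laird estimator (the three study-level SMDs and variances below are hypothetical, chosen only to show the mechanics):

```python
# DerSimonian-Laird random-effects pooling of study-level
# standardised mean differences (SMDs).
def pool_random_effects(effects, variances):
    """Return the pooled effect under a DerSimonian-Laird random-effects model."""
    w = [1 / v for v in variances]                    # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)     # between-study variance
    w_star = [1 / (v + tau2) for v in variances]      # random-effects weights
    return sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)

# Hypothetical SMDs for a working-memory measure from three studies.
smds = [-0.45, -0.30, -0.38]
variances = [0.02, 0.03, 0.025]
pooled = pool_random_effects(smds, variances)
print(round(pooled, 3))  # a weighted average lying within the range of inputs
```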
Limitations:
Our analysis consisted of cross-sectional and observational studies. We could not quantify the effect of collinearity.
Conclusions:
Individuals with CHD appear to have at least moderate deficits in executive functions. Given the growing population of people with CHD, more attention should be devoted to identifying executive dysfunction in this vulnerable group.