Current clinical guidelines for people at risk of heart disease in Australia recommend nutrition intervention in conjunction with pharmacotherapy(1). However, Australians living in rural and remote regions have less access to medical nutrition therapy (MNT) provided by Accredited Practising Dietitians (APDs) than their urban counterparts(2). The aim of the HealthyRHearts study was to trial the delivery of MNT by APDs using telehealth to eligible patients of General Practitioners (GPs) located in small to large rural towns in the Hunter New England region(3) of New South Wales, Australia. The study design was a 12-month pragmatic randomised controlled trial, with reduced total cholesterol as the key outcome. The study was place-based, meaning many of the research team and APDs were based rurally, so that the context of the GPs and patients was already known. Eligible participants were those assessed by their GP as at moderate-to-high risk of CVD. People in the intervention group received five MNT consultations (totalling two hours) delivered via telehealth by APDs, and also answered a personalised nutrition questionnaire to identify their priorities and to support personalised dietary behaviour change during the counselling. Both intervention and control groups received usual care from their GP and were provided access to the Australian Eating Survey (Heart version), a 242-item online food frequency questionnaire with technology-supported personalised nutrition reports that evaluated intake relative to heart-healthy eating principles. Of the 192 people who consented to participate, 132 were eligible on the basis of moderate-to-high risk. Pre-post participant medication use with a registered indication(4) for hypercholesterolemia, hypertension and glycemic control was documented according to class and strength (defined daily dose: DDD)(5).
Nine GP practices (with 91 participants recruited) were randomised to the intervention group and seven practices (41 participants) to control. Intervention participants attended 4.3 ± 1.4 of the 5 dietetic consultations offered. Of the 132 people with baseline clinical chemistry, 103 also provided a 12-month sample. Mean total cholesterol at baseline was 4.97 ± 1.13 mmol/L for both groups, with a 12-month reduction of 0.26 ± 0.77 mmol/L for intervention and 0.28 ± 0.79 mmol/L for control (p = 0.90, unadjusted). The median (IQR) number of medications for the intervention group was 2 (1–3) at both baseline and 12 months (p = 0.78), compared with 2 (1–3) and 3 (2–3) respectively for the control group. Combined DDD of all medications was 2.1 (0.5–3.8) at baseline and 2.5 (0.75–4.4) at 12 months (p = 0.77) for the intervention group, and 2.7 (1.5–4.0) and 3.0 (2.0–4.5) for the control group (p = 0.30). These results suggest that medications were a significant contributor to the management of total cholesterol. Further analysis is required to evaluate changes in total cholesterol attributable to medication prescription relative to the MNT counselling received by the intervention group.
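The combined defined daily dose metric reported above sums each drug's prescribed daily dose expressed as a multiple of a standard reference dose. A minimal Python sketch (not the study's code; drug names, doses, and the participant are hypothetical illustrations, with reference values in the style of the WHO ATC/DDD index):

```python
# Illustrative DDD reference values (mg/day); authoritative values
# come from the WHO ATC/DDD index, not this sketch.
WHO_DDD_MG = {"atorvastatin": 20, "ramipril": 2.5, "metformin": 2000}

def combined_ddd(regimen_mg_per_day):
    """Sum prescribed daily doses as multiples of each drug's DDD."""
    return sum(dose / WHO_DDD_MG[drug]
               for drug, dose in regimen_mg_per_day.items())

# hypothetical participant on three cardiometabolic medications
d = combined_ddd({"atorvastatin": 40, "ramipril": 5, "metformin": 1000})
```

Here 40 mg atorvastatin counts as 2.0 DDD, 5 mg ramipril as 2.0 DDD, and 1000 mg metformin as 0.5 DDD, so the combined DDD is 4.5.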
Traditional foods are increasingly being incorporated into modern diets. This is largely driven by consumers seeking alternative food sources that have superior nutritional and functional properties. Within Australia, Aboriginal and Torres Strait Islander peoples are looking to develop their traditional foods for commercial markets. However, supporting evidence to suggest these foods are safe for consumption within the wider general population is limited. At the 2022 NSA conference a keynote presentation titled ‘Decolonising food regulatory frameworks to facilitate First Peoples food sovereignty’ was presented. This presentation was followed by a manuscript titled ‘Decolonising food regulatory frameworks: Importance of recognising traditional culture when assessing dietary safety of traditional foods’, which was published in the conference proceedings journal(1). These pieces examined the current regulatory frameworks that are used to assess traditional foods and proposed a way forward that would allow Traditional Custodians to successfully develop their foods for modern markets. Building upon the previously highlighted works, this presentation will showcase best practice Indigenous engagement and collaboration principles in the development of traditionally used food products. To achieve this, we collaborated with a collective of Gamilaraay peoples who are looking to reignite their traditional grain practices and develop grain-based food products. To meet the current food safety regulatory requirements, we needed to understand how this grain would fit into modern diets, which included understanding the history of use, elucidating the nutritional and functional properties that can be attributed to the grain, and developing a safety dossier(2) so that the Traditional Custodians can confidently take their product to market. 
To aid the Traditional Custodians in performing their due diligence, we have systematically analysed the dietary safety of the selected native grain and compared it side-by-side with commonly consumed wheat in a range of in vitro bioassays and chemical analyses. From a food safety perspective, we show that the native grain is equivalent to commonly consumed wheat. The native grain has been shown to be no more toxic than wheat within our biological screening systems. Chemical analysis showed that levels of contaminants are below tolerable limits, and we were not able to identify any chemical classes of concern. Our initial findings support the history of safe use and suggest that the tested native grain species would be no less safe than commonly consumed wheat. This risk assessment, together with a previously published nutritional study(3), provides an overall indication that the grain is nutritionally superior and viable for commercial development. The learnings from this project can direct the future risk assessment of traditional foods and therefore facilitate the safe market access of a broader range of traditionally used foods. Importantly, the methods presented are culturally safe and financially viable for the small businesses hoping to enter the market.
Posttraumatic stress disorder (PTSD) has been associated with advanced epigenetic age cross-sectionally, but the association between these variables over time is unclear. This study conducted meta-analyses to test whether new-onset PTSD diagnosis and changes in PTSD symptom severity over time were associated with changes in two metrics of epigenetic aging over two time points.
Methods
We conducted meta-analyses of the association between change in PTSD diagnosis and symptom severity and change in epigenetic age acceleration/deceleration (age-adjusted DNA methylation age residuals as per the Horvath and GrimAge metrics) using data from 7 military and civilian cohorts participating in the Psychiatric Genomics Consortium PTSD Epigenetics Workgroup (total N = 1,367).
Results
Meta-analysis revealed that the interaction between Time 1 (T1) Horvath age residuals and new-onset PTSD over time was significantly associated with Horvath age residuals at T2 (meta β = 0.16, meta p = 0.02, p-adj = 0.03). The interaction between T1 Horvath age residuals and changes in PTSD symptom severity over time was significantly related to Horvath age residuals at T2 (meta β = 0.24, meta p = 0.05). No associations were observed for GrimAge residuals.
Conclusions
Results indicated that individuals who developed new-onset PTSD or showed increased PTSD symptom severity over time evidenced greater epigenetic age acceleration at follow-up than would be expected based on baseline age acceleration. This suggests that PTSD may accelerate biological aging over time and highlights the need for intervention studies to determine if PTSD treatment has a beneficial effect on the aging methylome.
Since cannabis was legalized in Canada in 2018, its use among older adults has increased. Although cannabis may exacerbate cognitive impairment, there are few studies on its use among older adults being evaluated for cognitive disorders.
Methods:
We analyzed data from 238 patients who attended a cognitive clinic between 2019 and 2023 and provided data on cannabis use. Health professionals collected information using a standardized case report form.
Results:
Cannabis use was reported by 23 of 238 patients (9.7%): 12 took cannabis for recreational purposes, 8 for medicinal purposes and 3 for both. Compared to non-users, cannabis users were younger (mean ± SD 62.0 ± 7.5 vs 68.9 ± 9.5 years; p = 0.001), more likely to have a mood disorder (p < 0.05) and more likely to be current or former cigarette smokers (p < 0.05). There were no significant differences in sex, race or education. The proportion with dementia compared with pre-dementia cognitive states did not differ significantly between users and non-users. Cognitive test scores were similar in users and non-users (Montreal Cognitive Assessment: 20.4 ± 5.0 vs 20.7 ± 4.5, p = 0.81; Folstein Mini-Mental State Exam: 24.5 ± 5.1 vs 26.0 ± 3.6, p = 0.25). The prevalence of insomnia, obstructive sleep apnea, anxiety disorders, alcohol use or psychotic disorders did not differ significantly.
Conclusion:
The prevalence of cannabis use among patients with cognitive concerns in this study was similar to the general Canadian population aged 65 and older. Further research is necessary to investigate patients’ motivations for use and explore the relationship between cannabis use and mood disorders and cognitive decline.
Narcolepsy is a chronic neurological disorder characterized by excessive daytime sleepiness (EDS), among other symptoms. Previous studies of narcolepsy have largely relied on quantitative methods, providing limited insight into the patient experience. This study used qualitative interviews to better understand this rare condition.
Methods
Patients with narcolepsy (types 1 [NT1] and 2 [NT2]) were recruited using convenience and snowball sampling. Trained qualitative researchers conducted hour-long, individual interviews. Interview transcripts were coded and thematically analyzed using inductive and deductive approaches.
Results
Twenty-two adults with narcolepsy (NT1=12; NT2=10) participated (average age: NT1=35; NT2=44). Most were female (NT1=83%; NT2=70%) and white (NT1=75%; NT2=60%). Average times since diagnosis were 7 years (NT1) and 11 years (NT2).
At disease onset, symptoms experienced included EDS (NT1=83%; NT2=80%)—sometimes involving sleep attacks (NT1=35%; NT2=50%)—fatigue (NT1=42%; NT2=30%), oversleeping (NT1=33%; NT2=20%), and cataplexy (NT1=42%). Participants sought a diagnosis from healthcare professionals including sleep specialists, neurologists, pulmonologists, psychiatrists, and primary care physicians. Many participants reported receiving a narcolepsy diagnosis >10 years after symptom onset (NT1=50%; NT2=60%). During that time, patients reported misdiagnoses, including depression, sleep apnea, and attention-deficit/hyperactivity disorder.
Common symptoms included EDS (NT1=100%; NT2=90%), cognitive impairment (NT1=92%; NT2=100%), and fatigue (NT1=75%; NT2=90%). All participants with NT1 reported cataplexy. Participants rated these symptoms as among the most bothersome.
Conclusions
Study results provide descriptions of narcolepsy symptoms and the often challenging journey toward seeking a diagnosis. By using patient-centered, qualitative methods, this study fills a gap by providing additional insights into the patient experience of narcolepsy.
Formulas are derived by which, given the factor loadings and the internal reliability of a test of unit length, the following estimates can be made: (1) the common-factor loadings for a similar (homogeneous) test of length n; (2) the number of times (n) that a test needs to be lengthened homogeneously to achieve a factor loading of a desired magnitude; and (3) the correlation between two tests, either or both of which have been altered in length, as a function of (a) the new factor loadings in the altered tests or (b) the original loadings in the unit-length tests. The appropriate use of the derived formulas depends upon the fulfillment of four assumptions enumerated.
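Although the abstract does not reproduce the formulas, the estimates it describes follow Spearman-Brown logic. A sketch in modern notation (the symbols are illustrative, not necessarily the paper's own):

```latex
% Reliability of a homogeneous test lengthened n times (Spearman-Brown):
r_{nn} = \frac{n\, r_{11}}{1 + (n-1)\, r_{11}}
% Corresponding common-factor loading a_n of the lengthened test, given
% the unit-length loading a_1 and internal reliability r_{11}:
a_n = \frac{a_1 \sqrt{n}}{\sqrt{1 + (n-1)\, r_{11}}}
% Solving for n gives the lengthening needed to reach a target loading:
n = \frac{a_n^2 \left(1 - r_{11}\right)}{a_1^2 - a_n^2\, r_{11}}
```

The third relation is just the second inverted, which is the form needed for estimate (2) in the abstract.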
Two current methods of deriving common-factor scores from tests are briefly examined and rejected. One of these estimates a score from a multiple-regression equation with as many terms as there are tests in the battery. The other limits the equation to a few tests heavily saturated with the desired factor, with or without tests used to suppress the undesired factors. In the proposed methods, the single best test for each common factor is the starting point. Such a test ordinarily has a very few undesired factors to be suppressed, frequently only one. The suppression test should be univocal, or nearly so. Fortunately, there are relatively univocal tests for factors that commonly require suppression. Equations are offered by which the desired-factor test and a single suppression test can be weighted in order to achieve one or more objectives. Among the objectives are (1) maximizing the desired factor variance, (2) minimizing the undesired factor variance, (3) a compromise, in which the undesired variance is materially reduced without loss in desired variance, and (4) a change to any selected ratio of desired to undesired variance. A more generalized solution is also suggested. The methods can be extended in part to the suppression of more than one factor. Equations are derived for the suppression of two factors.
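As a minimal illustration of the two-test weighting described (a sketch of the idea, not the paper's actual equations): take a desired-factor test X_d and a near-univocal suppression test X_s.

```latex
% Composite estimate of factor F from the desired-factor test X_d
% and a single suppression test X_s:
\hat{F} = X_d - k\, X_s
% If the undesired factor loads b_d on X_d and b_s on X_s, the choice
k = \frac{b_d}{b_s}
% drives the undesired-factor variance of the composite to zero
% (objective 2); intermediate values of k trade suppression of
% undesired variance against retention of desired variance
% (objectives 1 and 3), and any target ratio (objective 4) fixes k.
```

This shows why near-univocality of the suppression test matters: if X_s carries the desired factor as well, subtracting k X_s also removes desired variance.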
Head and neck squamous cell carcinomas (HNSCCs) are aggressive tumours lacking a standardised timeline for treatment initiation post-diagnosis. Delays beyond 60 days are linked to poorer outcomes and higher recurrence risk.
Methods:
A retrospective review was conducted on patients over 18 with HNSCC treated with (chemo)radiation at a rural tertiary care centre (September 2020–2022). Data on patient demographics, oncologic characteristics, treatment details and delay causes were analysed using SPSS.
Results:
Out of 93 patients, 35.5% experienced delays in time to treatment initiation (TTI) of more than 60 days. Median TTI was 73 days for delayed cases, compared to 41.5 days otherwise. No significant differences in demographics or cancer characteristics were observed between groups. The primary reasons for delay were care coordination (69.7%) and patient factors (18.2%). AJCC cancer stage showed a trend towards longer delays in advanced stages.
Conclusion:
One-third of patients faced delayed TTI, primarily due to care coordination and lack of social support. These findings highlight the need for improved multidisciplinary communication and patient support mechanisms, suggesting potential areas for quality improvement in HNSCC treatment management.
The adipofascial anterolateral thigh (AF-ALT) free flap represents a versatile technique in head and neck reconstructions, with its applications increasingly broadening. The objective was to detail the novel utilization of the AF-ALT flap in orbital and skull base reconstruction, along with salvage laryngectomy onlay in our case series.
Method
We conducted a retrospective analysis at Roswell Park Comprehensive Cancer Center, spanning July 2019 to June 2023, focusing on patient demographics and reconstructive parameters.
Results
The AF-ALT flap was successfully employed in eight patients (average age 59, body mass index [BMI] 32.0) to repair various defects. Noteworthy outcomes were observed in skull base reconstructions, with no flap failures or major complications over an average 12-month follow-up. Donor sites typically healed well with minimal interventions.
Conclusion
Our series is the first to report the AF-ALT flap's efficacy in anterior skull base and orbital reconstructions, demonstrating an additional innovation in complex head and neck surgeries.
Cancer patients are among the most vulnerable populations during and after a disaster. We evaluated the impact of treatment interruption on the survival of women with gynecologic cancer in Puerto Rico following Hurricanes Irma and María.
Methods:
A retrospective cohort study was conducted among a clinic-based sample of women with gynecological cancer diagnosed between January 2016 and September 2017 (n = 112). Women were followed from diagnosis until December 2019 to assess vital status. Kaplan-Meier survival curves and Cox proportional hazards models were used.
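The Kaplan-Meier estimator named in the methods can be sketched in plain Python (a toy illustration with hypothetical follow-up data, not the study's analysis):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimator.

    times  : follow-up time for each subject
    events : 1 if the event (death) was observed, 0 if censored
    Returns (time, survival probability) pairs at each event time.
    """
    surv = 1.0
    curve = []
    for t in sorted(set(times)):                    # distinct times, in order
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n = sum(1 for ti in times if ti >= t)       # number still at risk
        if d > 0:
            surv *= 1 - d / n                       # step down at each death
            curve.append((t, surv))
    return curve

# toy data: six women, months of follow-up, 1 = died, 0 = censored
curve = kaplan_meier([6, 13, 21, 30, 31, 37], [1, 0, 1, 1, 0, 0])
```

Censored subjects leave the risk set without stepping the curve down, which is what distinguishes this from a naive cumulative death fraction.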
Results:
Mean age was 56 (± 12.3) years; corpus uteri (58.9%) was the most common gynecologic cancer. Predominant treatments were surgery (91.1%) and chemotherapy (44.6%). Overall, 75.9% were receiving treatment before the hurricanes, 16.1% experienced treatment interruptions, and 8.9% died during the follow-up period. Factors associated with treatment interruption in bivariate analysis included younger age (≤55 years), having regional/distant disease, and receiving > 1 cancer treatment (P < 0.05). Crude analysis revealed an increased risk of death among women with treatment interruption (HR: 3.88, 95% CI: 1.09-13.77), persisting after adjusting for age and cancer stage (HR: 2.49, 95% CI: 0.69-9.01).
Conclusions:
Findings underscore the detrimental impact of treatment interruption on cancer survival in the aftermath of hurricanes, emphasizing the need for emergency response plans for this vulnerable population.
In decision-making, especially for sustainability, choosing the right assessment tools is crucial but challenging due to the abundance of options. A new method is introduced to streamline this process, aiding policymakers and managers. This method involves four phases: scoping, cataloging, selection, and validation, combining data analysis with stakeholder engagement. Using the food system as an example, the approach demonstrates how practitioners can select tools effectively based on input variables and desired outcomes to address sustainability risks. This method can be applied across various sectors, offering a systematic way to enhance decision-making and manage sustainability effectively.
Technical Summary
Decision making frequently entails the selection and application of assessment tools. For sustainability decisions there is a plethora of tools available for environmental assessment, yet no established, clear approach to determine which tools are appropriate and resource-efficient for application. Here we present an extensive inventory of tools and a novel taxonomic method which enables efficient, effective tool selection to improve decision making for policymakers and managers. The tool selection methodology follows four main phases based on divergence-convergence logic: a scoping phase, a cataloging phase, a selection phase and a validation phase. This approach combines elements of data-driven analysis with participatory techniques for stakeholder engagement to achieve buy-in and to ensure efficient management of progress and agile course correction when needed. It builds on the current limited range and scope of approaches to tool selection, and is flexible and Artificial Intelligence-ready in order to facilitate more rapid integration and uptake. Using the food system as a case study, we demonstrate how practitioners can use available input variables and desired output metrics to select the most appropriate tools to manage sustainability risks, with the approach having wide applicability to other sectors.
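The selection phase (matching available input variables and desired output metrics to candidate tools) can be sketched as a filter over a tool catalog. Tool names and fields below are hypothetical illustrations, not entries from the actual inventory:

```python
# Hypothetical catalog: each tool declares the input variables it needs
# and the output metrics it produces (all names illustrative only).
CATALOG = [
    {"tool": "LCA-lite",  "inputs": {"ingredient_masses", "energy_use"},
     "outputs": {"ghg_footprint"}},
    {"tool": "WaterRisk", "inputs": {"water_use", "region"},
     "outputs": {"water_scarcity_score"}},
    {"tool": "FullLCA",   "inputs": {"ingredient_masses", "energy_use",
                                     "transport_km"},
     "outputs": {"ghg_footprint", "water_scarcity_score"}},
]

def select_tools(available_inputs, desired_outputs):
    """Keep tools whose required inputs are all available and that
    produce at least one desired output metric."""
    return [t["tool"] for t in CATALOG
            if t["inputs"] <= available_inputs       # subset test
            and t["outputs"] & desired_outputs]      # overlap test

chosen = select_tools({"ingredient_masses", "energy_use"},
                      {"ghg_footprint"})
```

In this toy example only "LCA-lite" qualifies: "FullLCA" produces the desired metric but requires an input that is not available. The validation phase would then review the shortlist with stakeholders.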
Social Media Summary
New method simplifies tool selection for sustainable decisions, aiding policymakers & managers. #Sustainability #DecisionMaking
Inhibitory control plays an important role in children’s cognitive and socioemotional development, including their psychopathology. It has been established that contextual factors such as socioeconomic status (SES) and parents’ psychopathology are associated with children’s inhibitory control. However, the relations between the neural correlates of inhibitory control and contextual factors have been rarely examined in longitudinal studies. In the present study, we used both event-related potential (ERP) components and time-frequency measures of inhibitory control to evaluate the neural pathways between contextual factors, including prenatal SES and maternal psychopathology, and children’s behavioral and emotional problems in a large sample of children (N = 560; 51.75% female; mean age = 7.13 years; age range = 4–11 years). Results showed that theta power, which was positively predicted by prenatal SES and was negatively related to children’s externalizing problems, mediated the longitudinal and negative relation between them. ERP amplitudes and latencies did not mediate the longitudinal association between prenatal risk factors (i.e., prenatal SES and maternal psychopathology) and children’s internalizing and externalizing problems. Our findings increase our understanding of the neural pathways linking early risk factors to children’s psychopathology.
A series of synthetic goethites containing varying amounts of Si and P dopants were characterized by X-ray powder diffraction, electron diffraction, microbeam electron diffraction, and Mössbauer spectroscopy. Very low level incorporation produced materials having structural and spectral properties similar to those of poorly crystalline synthetic or natural goethite. At higher incorporation levels, mixtures of noncrystalline materials were obtained which exhibited Mössbauer spectra typical of noncrystalline materials mixed with a superparamagnetic component. Microbeam electron diffraction indicated that these mixtures contained poorly crystalline goethite, poorly crystalline ferrihydrite, and a noncrystalline component. If the material was prepared with no aging of the alkaline Fe3+ solution before the addition of Na2HPO4 or Na2SiO3, materials were obtained containing little if any superparamagnetic component. If the alkaline Fe3+ solution was aged for 48 hr before the addition, goethite nuclei formed and apparently promoted the precipitation of a superparamagnetic phase. The Mössbauer-effect hyperfine parameters and the saturation internal-hyperfine field obtained at 4.2 K were typical of those of goethite; however, the Mössbauer spectra indicated that the ordering temperature, as reflected in the relaxation rate and/or the blocking temperature, decreased with increasing incorporation of Si and P. The complete loss of crystallinity indicates that Si and P did not substitute for Fe, but rather adsorbed on crystal-growth sites, thereby preventing uniform crystal growth.
In the United States, all 50 states and the District of Columbia have Good Samaritan Laws (GSLs). Designed to encourage bystanders to aid at the scene of an emergency, GSLs generally limit the risk of civil tort liability if the care is rendered in good faith. Nation-wide, a leading cause of preventable death is uncontrolled external hemorrhage. Public bleeding control initiatives aim to train the public to recognize life-threatening external bleeding, perform life-sustaining interventions (including direct pressure, tourniquet application, and wound packing), and to promote access to bleeding control equipment to ensure a rapid response from bystanders.
Methods:
This study reviewed the GSLs of each state and the District of Columbia to identify what type of responder is covered by the law (eg, all laypersons, only trained individuals, or only licensed health care providers) and whether bleeding control is explicitly included or excluded in their Good Samaritan coverage.
Results:
Good Samaritan Laws providing civil liability qualified immunity were identified in all 50 states and the District of Columbia. One state, Oklahoma, specifically includes bleeding control in its GSLs. Six states – Connecticut, Illinois, Kansas, Kentucky, Michigan, and Missouri – have laws that define those covered under Good Samaritan immunity, generally limiting protection to individuals trained in a standard first aid or resuscitation course or to health care clinicians. No state explicitly excludes bleeding control from its GSLs.
Conclusion:
Nation-wide across the United States, most states have broad bystander coverage within GSLs for emergency medical conditions of all types, including bleeding emergencies, and no state explicitly excludes bleeding control interventions. Some states restrict coverage to those health care personnel or bystanders who have completed a specific training program. Opportunity exists for additional research into those states whose GSLs may not be inclusive of bleeding control interventions.
People with schizophrenia on average are more socially isolated, lonelier, have more social cognitive impairment, and are less socially motivated than healthy individuals. People with bipolar disorder also have social isolation, though typically less than that seen in schizophrenia. We aimed to disentangle whether the social cognitive and social motivation impairments observed in schizophrenia are a specific feature of the clinical condition v. social isolation generally.
Methods
We compared four groups (clinically stable patients with schizophrenia or bipolar disorder, individuals drawn from the community with self-described social isolation, and a socially connected community control group) on loneliness, social cognition, and approach and avoidance social motivation.
Results
Individuals with schizophrenia (n = 72) showed intermediate levels of social isolation, loneliness, and social approach motivation between the isolated (n = 96) and connected control (n = 55) groups. However, they showed significant deficits in social cognition compared to both community groups. Individuals with bipolar disorder (n = 48) were intermediate between isolated and control groups for loneliness and social approach. They did not show deficits on social cognition tasks. Both clinical groups had higher social avoidance than both community groups.
Conclusions
The results suggest that social cognitive deficits in schizophrenia, and high social avoidance motivation in both schizophrenia and bipolar disorder, are distinct features of the clinical conditions and not byproducts of social isolation. In contrast, differences between clinical and control groups on levels of loneliness and social approach motivation were congruent with the groups' degree of social isolation.
Traumatic brain injury is one of several recognized risk factors for cognitive decline and neurodegenerative disease. Currently, risk scores involving modifiable risk/protective factors for dementia have not incorporated head injury history as part of their overall weighted risk calculation. We investigated the association between the LIfestyle for BRAin Health (LIBRA) risk score with odds of mild cognitive impairment (MCI) diagnosis and cognitive function in older former National Football League (NFL) players, both with and without the influence of concussion history.
Participants and Methods:
Former NFL players, ages ≥ 50 (N=1050; mean age=61.1±5.4 years), completed a general health survey including self-reported medical history and ratings of function across several domains. LIBRA factors (weighted value) included cardiovascular disease (+1.0), hypertension (+1.6), hyperlipidemia (+1.4), diabetes (+1.3), kidney disease (+1.1), cigarette use history (+1.5), obesity (+1.6), depression (+2.1), social/cognitive activity (-3.2), physical inactivity (+1.1), low/moderate alcohol use (-1.0), and healthy diet (-1.7). Within Group 1 (n=761), logistic regression models assessed the association of LIBRA scores and the independent contribution of concussion history with the odds of MCI diagnosis. A modified-LIBRA score incorporated concussion history at the level at which planned contrasts showed significant associations across concussion history groups (0, 1-2, 3-5, 6-9, 10+). The weighted value for concussion history (+1.9) within the modified-LIBRA score was based on its proportional contribution to dementia relative to other LIBRA risk factors, as proposed by the 2020 Lancet Commission Report on Dementia Prevention. Associations of the modified-LIBRA score with odds of MCI and cognitive function were assessed via logistic and linear regression, respectively, in a subset of the sample (Group 2; n=289) who also completed the Brief Test of Adult Cognition by Telephone (BTACT). Race was included as a covariate in all models.
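The weighted LIBRA scoring amounts to summing the weights of the factors a participant endorses. A stdlib Python sketch using the weights listed in the methods (the participant is hypothetical, and the +1.9 concussion weight is the modified score's addition):

```python
# LIBRA factor weights as listed in the methods.
LIBRA_WEIGHTS = {
    "cardiovascular_disease": 1.0, "hypertension": 1.6,
    "hyperlipidemia": 1.4, "diabetes": 1.3, "kidney_disease": 1.1,
    "cigarette_use_history": 1.5, "obesity": 1.6, "depression": 2.1,
    "social_cognitive_activity": -3.2, "physical_inactivity": 1.1,
    "low_moderate_alcohol": -1.0, "healthy_diet": -1.7,
}
CONCUSSION_WEIGHT = 1.9  # modified-LIBRA addition

def libra_score(factors, concussion_history=False, modified=False):
    """Sum the weights of the endorsed factors; the modified score
    adds the concussion-history weight when applicable."""
    score = sum(LIBRA_WEIGHTS[f] for f in factors)
    if modified and concussion_history:
        score += CONCUSSION_WEIGHT
    return round(score, 1)

# hypothetical participant endorsing three factors
s = libra_score({"hypertension", "depression", "healthy_diet"})
m = libra_score({"hypertension", "depression", "healthy_diet"},
                concussion_history=True, modified=True)
```

Note that protective factors (negative weights) pull the total down, so a participant's score can be negative.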
Results:
The median LIBRA score in Group 1 was 1.6 (IQR = -1, 3.6). Standard and modified-LIBRA median scores were 1.1 (IQR = -1.3, 3.3) and 2 (IQR = -0.4, 4.6), respectively, within Group 2. In Group 1, LIBRA score was significantly associated with odds of MCI diagnosis (odds ratio [95% confidence interval] = 1.27 [1.19, 1.28], p < .001). Concussion history provided additional information beyond LIBRA scores and was independently associated with odds of MCI; specifically, odds of MCI were higher among those with 6-9 (OR = 2.54 [1.21, 5.32], p < .001) and 10+ (OR = 4.55 [2.21, 9.36], p < .001) concussions, compared with those with no prior concussions. Within Group 2, the modified-LIBRA score was associated with higher odds of MCI (OR = 1.61 [1.15, 2.25]) and incrementally improved model information (0.04 increase in Nagelkerke R2) above standard LIBRA scores in the same model. Modified-LIBRA scores were inversely associated with BTACT Executive Function (B = -0.53 [0.08], p = .002) and Episodic Memory scores (B = -0.53 [0.08], p = .002).
Conclusions:
Numerous modifiable risk/protective factors for dementia are reported in former professional football players, but incorporating concussion history may aid the multifactorial appraisal of cognitive decline risk and identification of areas for prevention and intervention. Integration of multi-modal biomarkers will advance this person-centered, holistic approach toward dementia reduction, detection, and intervention.
It has been posited that alcohol use may confound the association between greater concussion history and poorer neurobehavioral functioning. However, while greater alcohol use is positively correlated with neurobehavioral difficulties, the association between alcohol use and concussion history is not well understood. Therefore, this study investigated the cross-sectional and longitudinal associations between cumulative concussion history, years of contact sport participation, and health-related/psychological factors with alcohol use in former professional football players across multiple decades.
Participants and Methods:
Former professional American football players completed general health questionnaires in 2001 and 2019, including demographic information, football history, concussion/medical history, and health-related/psychological functioning. Alcohol use frequency and amount were reported for three timepoints: during the professional career (collected retrospectively in 2001), 2001, and 2019. For the professional-career and 2001 timepoints, frequency categories were none, 1-2, 3-4, and 5-7 days/week, while amount categories were none, 1-2, 3-5, 6-7, and 8+ drinks/occasion. For 2019, frequency categories were never, monthly or less, 2-4 times/month, 2-3 times/week, and >4 times/week, while amount categories were none, 1-2, 3-4, 5-6, 7-9, and 10+ drinks/occasion. Scores on a screening measure for Alcohol Use Disorder (CAGE) were also available for the professional-career and 2001 timepoints. Concussion history was recorded in 2001 and binned into five groups: 0, 1-2, 3-5, 6-9, 10+. Depression and pain interference were assessed via PROMIS measures at all timepoints. Sleep disturbance was assessed in 2001 via a separate instrument and with PROMIS Sleep Disturbance in 2019. Spearman’s rho correlations tested associations between concussion history and years of sport participation with alcohol use across timepoints, and whether poor health functioning (depression, pain interference, sleep disturbance) in 2001 and 2019 was associated with alcohol use both within and between timepoints.
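Spearman's rho, the correlation used throughout these analyses and well suited to the ordinal drinking categories above, is the Pearson correlation of midranks. A stdlib Python sketch (illustrative data only):

```python
def _ranks(xs):
    """Midranks: tied values share the average of the positions they occupy."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1                      # extend over the tie block
        avg = (i + j) / 2 + 1           # average 1-based rank of the block
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

rho = spearman_rho([0, 1, 3, 6, 10], [2, 1, 9, 20, 55])
```

Because only ranks enter the computation, rho is insensitive to how the underlying drinking categories are numerically coded, as long as their order is preserved.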
Results:
Among the 351 participants (mean age = 47.86 [SD = 10.18] in 2001), there were no significant associations between concussion history or years of contact sport participation and CAGE scores or alcohol use frequency/amount during the professional career, 2001, or 2019 (rhos = -.072 to .067, ps > .05). In 2001, greater depressive symptomology and sleep disturbance were related to higher CAGE scores (rho = .209, p < .001; rho = .176, p < .001, respectively), while greater depressive symptomology, pain interference, and sleep disturbance were related to higher alcohol use frequency (rho = .176, p = .002; rho = .109, p = .045; rho = .132, p = .013, respectively) and amount/occasion (rho = .215, p < .001; rho = .127, p = .020; rho = .153, p = .004, respectively). In 2019, depressive symptomology, pain interference, and sleep disturbance were not related to alcohol use (rhos = -.047 to .087, ps > .05). Between timepoints, more sleep disturbance in 2001 was associated with higher alcohol amount/occasion in 2019 (rho = .115, p = .036).
Conclusions:
Increased alcohol intake has been theorized to be a consequence of greater concussion history, and as such, thought to confound associations between concussion history and neurobehavioral function later in life. Our findings indicate concussion history and years of contact sport participation were not significantly associated with alcohol use cross-sectionally or longitudinally, regardless of alcohol use characterization. While higher levels of depression, pain interference, and sleep disturbance in 2001 were related to greater alcohol use in 2001, they were not associated cross-sectionally in 2019. Results support the need to concurrently address health-related and psychological factors in the implementation of alcohol use interventions for former NFL players, particularly earlier in the sport discontinuation timeline.