Objectives/Goals: Electronic health record (EHR)-based recruitment can facilitate participation in clinical trials, but it is not a panacea for trial accrual challenges. We conducted a root cause analysis to identify EHR-based accrual barriers and facilitators in a pragmatic randomized trial of metformin for patients with prostate cancer and glucose intolerance. Methods/Study Population: We quantitatively analyzed enrollment drop-offs among eligible patients who either did not complete a consent (with analysis of the EHR-embedded consent process) or who completed a consent but were not enrolled (with analysis of the EHR implementation of a Best Practice Alert (BPA)). We summarized data from the EHR by eligibility, provider encounters, and alerts, and generated CONSORT diagrams and tables to trace the enrollment pathway. We supplemented quantitative findings with a thematic analysis of semi-structured individual interviews with eligible patients (n = 10) and study providers (n = 4) to identify systematic barriers to recruitment and enrollment of eligible patients. Results/Anticipated Results: CONSORT diagram analysis found that 24% of potentially eligible patients (268 of 1130) had an eligible study encounter but were not enrolled. Additionally, BPAs did not trigger for some eligible patients. Interviews revealed that study providers wanted more detailed information about which study arm their patient would be assigned to and about next steps after enrollment, especially the additional lab tests and follow-up care needed. Patient interviews suggested that patients often did not remember completing the consent process and felt overwhelmed with appointments and information; patients expected providers to actively bring up research opportunities during appointments. Discussion/Significance of Impact: While pragmatic EHR-embedded trials are often characterized as lower-burden, these trials still require active engagement by providers, as well as ongoing attention from both research and informatics teams, to ensure that EHR-embedded processes function as designed and are effective in recruiting study participants.
The stellar age and mass of galaxies have been suggested as the primary determinants of the dynamical state of galaxies, with environment seemingly playing no or only a very minor role. We use a sample of 77 galaxies at intermediate redshift ($z\sim0.3$) in the Middle-Ages Galaxies Properties with Integral field spectroscopy (MAGPI) Survey to study the subtle impact of environment on galaxy dynamics. We use a combination of statistical techniques (simple and partial correlations and principal component analysis) to isolate the contribution of environment to galaxy dynamics, while explicitly accounting for known factors such as stellar age, star formation histories, and stellar masses. We consider the following dynamical parameters: high-order kinematics of the line-of-sight velocity distribution (parametrised by the Gauss-Hermite coefficients $h_3$ and $h_4$), kinematic asymmetries $V_{\textrm{asym}}$ derived using kinemetry, and the observational spin parameter proxy $\lambda_{R_e}$. Of these, the mean $h_4$ is the only parameter found to have a significant correlation with environment as parametrised by group dynamical mass. This correlation exists even after accounting for age and stellar mass trends. We also find that satellite and central galaxies exhibit distinct dynamical behaviours, suggesting they are dynamically distinct classes. Finally, we confirm that variations in the spin parameter $\lambda_{R_e}$ are most strongly (anti-)correlated with age, as seen in local studies, and show that this dependence is already well established by $z\sim0.3$.
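The partial-correlation step described above can be illustrated with a minimal sketch (not the MAGPI analysis code; the column names and values below are hypothetical): the correlation between a dynamical parameter such as mean $h_4$ and an environment proxy is computed from the residuals left after regressing out stellar age and mass, and a PCA summarises the joint variance structure.

```python
import numpy as np
import pandas as pd
from scipy import stats
from sklearn.decomposition import PCA

def partial_corr(df, x, y, controls):
    """Pearson correlation of x and y after regressing out the control columns."""
    Z = np.column_stack([np.ones(len(df))] + [df[c].to_numpy() for c in controls])
    resid_x = df[x] - Z @ np.linalg.lstsq(Z, df[x].to_numpy(), rcond=None)[0]
    resid_y = df[y] - Z @ np.linalg.lstsq(Z, df[y].to_numpy(), rcond=None)[0]
    return stats.pearsonr(resid_x, resid_y)

# Hypothetical per-galaxy table: column names are illustrative, not actual MAGPI names.
rng = np.random.default_rng(0)
galaxies = pd.DataFrame({
    "h4_mean": rng.normal(0.02, 0.03, 77),
    "log_group_mass": rng.normal(13.0, 0.8, 77),   # environment proxy
    "age_gyr": rng.uniform(1, 10, 77),
    "log_mstar": rng.normal(10.5, 0.5, 77),
})

r, p = partial_corr(galaxies, "h4_mean", "log_group_mass",
                    controls=["age_gyr", "log_mstar"])
print(f"partial r = {r:.2f}, p = {p:.3f}")

# PCA on the standardised parameter set shows which combination of variables
# carries most of the variance.
X = (galaxies - galaxies.mean()) / galaxies.std()
pca = PCA().fit(X)
print(pca.explained_variance_ratio_)
```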
Australian children fall short of national dietary guidelines, with only 63% consuming adequate fruit and only 10% consuming enough vegetables. Before school care operates as part of Out of School Hours Care (OSHC) services and provides an opportunity to address poor dietary habits in children. The aim of this study was to describe the food and beverages provided in before school care and to explore how service-level factors influence food provision.
Design:
A cross-sectional study was conducted in OSHC services. Each service's before school care session was visited twice between March and June 2021. Direct observation was used to capture food and beverage provision and child and staff behaviour during breakfast. Interviews with staff collected information on service characteristics. Foods were categorised using the Australian Dietary Guidelines, and frequencies were calculated. Fisher's exact test was used to compare food provision with service characteristics (a minimal worked example follows this abstract).
Setting:
The before school care of OSHC services in New South Wales, Australia.
Participants:
25 OSHC services.
Results:
Fruit was provided on 22% (n=11) of days and vegetables on 12% (n=6). Services with nutrition policies containing specific (i.e. measurable) language on food provision were more likely to provide fruit than those with policies using non-specific language (p = 0.027). Services that reported receiving training in healthy eating provided more vegetables than those that had not received training (p = 0.037).
Conclusions:
Before school care can be supported to improve food provision through staff professional development and through advocacy to regulatory bodies for greater specificity in the nutrition policies required of service providers.
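A minimal sketch of the kind of comparison reported above (counts are illustrative, not the study data): a 2 × 2 table of policy wording against whether fruit was provided, tested with Fisher's exact test in SciPy.

```python
import numpy as np
from scipy.stats import fisher_exact

# Rows: policy wording (specific, non-specific); columns: fruit provided (yes, no).
# Counts are illustrative only.
table = np.array([[8, 4],
                  [3, 10]])

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```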
This work presents visual morphological and dynamical classifications for 637 spatially resolved galaxies, most of which are at intermediate redshift (z ∼ 0.3), in the Middle-Ages Galaxy Properties with Integral field spectroscopy (MAGPI) Survey. For each galaxy, we obtain a minimum of 11 independent visual classifications by knowledgeable classifiers. We use an extension of the standard Dawid-Skene Bayesian model, introducing classifier-specific confidence parameters and galaxy-specific difficulty parameters, to quantify classifier confidence and infer reliable statistical confidence estimates. Selecting a sub-sample of 86 bright (r < 20 mag), high-confidence (> 0.98) morphological classifications at redshifts 0.2 ≤ z ≤ 0.4, we confirm that the full range of morphological types is represented in MAGPI, as intended in the survey design. Similarly, with a sub-sample of 82 bright high-confidence stellar kinematic classifications, we find that the rotating and non-rotating galaxies seen at low redshift are already in place at intermediate redshifts. We do not find evidence that the kinematic morphology-density relation seen at z ∼ 0 is established at z ∼ 0.3. We suggest that galaxies without obvious stellar rotation are dynamically pre-processed sometime before z ∼ 0.3 within lower mass groups before joining denser environments.
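For readers unfamiliar with the base model named above, the sketch below implements the standard Dawid-Skene estimator with a simple EM loop; the paper's Bayesian extension adds classifier-specific confidence and galaxy-specific difficulty parameters, which are not reproduced here, and the labels below are toy values.

```python
import numpy as np

def dawid_skene(labels, n_classes, n_iter=50):
    """Standard Dawid-Skene EM. labels[i, j] = class given by classifier j to item i,
    or -1 if missing. Returns posterior class probabilities per item."""
    n_items, n_raters = labels.shape
    # Initialise item-class posteriors from per-item vote shares.
    post = np.zeros((n_items, n_classes))
    for i in range(n_items):
        for c in labels[i][labels[i] >= 0]:
            post[i, c] += 1
    post /= post.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        # M-step: class priors and per-classifier confusion matrices.
        prior = post.mean(axis=0)
        conf = np.full((n_raters, n_classes, n_classes), 1e-6)
        for j in range(n_raters):
            seen = labels[:, j] >= 0
            for c_true in range(n_classes):
                for c_obs in range(n_classes):
                    conf[j, c_true, c_obs] += post[seen & (labels[:, j] == c_obs), c_true].sum()
            conf[j] /= conf[j].sum(axis=1, keepdims=True)

        # E-step: recompute item posteriors given priors and confusion matrices.
        log_post = np.tile(np.log(prior), (n_items, 1))
        for j in range(n_raters):
            seen = labels[:, j] >= 0
            log_post[seen] += np.log(conf[j][:, labels[seen, j]]).T
        post = np.exp(log_post - log_post.max(axis=1, keepdims=True))
        post /= post.sum(axis=1, keepdims=True)
    return post

# Toy example: 4 galaxies, 3 classifiers, 2 classes; -1 marks a missing classification.
votes = np.array([[0, 0, 1],
                  [1, 1, 1],
                  [0, -1, 0],
                  [1, 0, -1]])
print(dawid_skene(votes, n_classes=2).round(2))
```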
In China, low levels of early childhood development (ECD) in rural areas may inhibit economic development as the nation attempts to transition from a middle-income manufacturing-based economy to a high-income innovation economy. This paper surveys the recent literature on ECD among children ages 0-3 years in rural China, including rates of developmental delays, causes of delays, and implications for the future of China's economy. Recent studies have found high rates of developmental delays among young children in rural China and point to poor nutrition and psychosocial stimulation as the primary causes. This review highlights the need for large-scale ECD interventions in rural China to raise human capital and support future economic growth.
The rising incidence of neurodegenerative diseases in an ageing global population has shifted research focus towards modifiable risk factors, such as diet. Despite potential links between dietary patterns and brain health, inconsistencies in neuroimaging outcomes underscore a gap in understanding how diet impacts brain ageing. This study explores the relationship between three dietary patterns – the Mediterranean, Dietary Approaches to Stop Hypertension (DASH) and Mediterranean-DASH Intervention for Neurodegenerative Delay (MIND) diets – and cognitive outcomes as well as brain connectivity. The study aimed to assess the association of these diets with brain structure and cognitive function in a middle-aged healthy group and an older cohort with subjective cognitive decline. The study included cognitive assessments and diffusion-weighted MRI data to analyse white matter microstructural integrity. Participants comprised fifty-five older individuals with subjective cognitive decline (54·5 % female, mean age = 64) and fifty-two healthy middle-aged individuals (48·1 % female, mean age = 53). Age inversely correlated with certain cognitive functions and global brain metrics across both cohorts. Adherence to the Mediterranean, DASH and MIND diets showed no significant associations with cognitive performance or global brain metrics after adjusting for covariates (age, education, BMI). Network-based statistics analysis revealed differences in brain subnetworks based on DASH diet adherence levels in the subjective cognitive decline cohort. In the healthy cohort, lower white matter connectivity was associated with reduced adherence to the MIND and DASH diets. Ultimately, the study found no strong evidence connecting dietary patterns to cognitive or brain connectivity outcomes. Future research should focus on longitudinal studies and refine dietary assessments.
Impulsivity is a multidimensional trait associated with substance use disorders (SUDs), but the relationship between distinct impulsivity facets and stages of substance use involvement remains unclear.
Methods
We used genomic structural equation modeling and genome-wide association studies (N = 79,729–903,147) to examine the latent genetic architecture of nine impulsivity traits and seven substance use (SU) and SUD traits.
Results
We found that the SU and SUD factors were strongly genetically inter-correlated (rG=0.77) but their associations with impulsivity facets differed. Lack of premeditation, negative urgency, and positive urgency were equally positively genetically correlated with both the SU (rG=0.30–0.50) and SUD (rG=0.38–0.46) factors; sensation seeking was more strongly genetically correlated with the SU factor (rG=0.27 versus rG=0.10); delay discounting was more strongly genetically correlated with the SUD factor (rG=0.31 versus rG=0.21); and lack of perseverance was only weakly genetically correlated with the SU factor (rG=0.10). After controlling for the genetic correlation between SU/SUD, we found that lack of premeditation was independently genetically associated with both the SU (β=0.42) and SUD factors (β=0.21); sensation seeking and positive urgency were independently genetically associated with the SU factor (β=0.48, β=0.33, respectively); and negative urgency and delay discounting were independently genetically associated with the SUD factor (β=0.33, β=0.36, respectively).
Conclusions
Our findings show that specific impulsivity facets confer risk for distinct stages of substance use involvement, with potential implications for SUD prevention and treatment.
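As a worked illustration of what "controlling for the genetic correlation between SU/SUD" means numerically, the sketch below converts a set of correlations into standardised partial regression coefficients via β = R⁻¹r; the SU-SUD correlation of 0.77 is taken from the abstract, while the facet correlations are hypothetical and the regression direction is purely illustrative.

```python
import numpy as np

def standardized_betas(r_predictors, r_outcome):
    """Standardised regression coefficients beta = R_xx^{-1} r_xy, given the
    predictor inter-correlation matrix and the predictor-outcome correlations."""
    return np.linalg.solve(np.asarray(r_predictors), np.asarray(r_outcome))

# Predictors: the SU and SUD latent factors (genetic correlation 0.77, per the abstract).
R_xx = [[1.00, 0.77],
        [0.77, 1.00]]

# Hypothetical genetic correlations of one impulsivity facet with SU and SUD.
r_xy = [0.45, 0.42]

beta_su, beta_sud = standardized_betas(R_xx, r_xy)
print(f"beta_SU = {beta_su:.2f}, beta_SUD = {beta_sud:.2f}")
```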
To describe the real-world clinical impact of a commercially available plasma cell-free DNA metagenomic next-generation sequencing assay, the Karius test (KT).
Methods:
We retrospectively evaluated the clinical impact of KT by clinical panel adjudication. Descriptive statistics were used to study associations of diagnostic indications, host characteristics, and KT-generated microbiologic patterns with the clinical impact of KT. Multivariable logistic regression modeling was used to further characterize predictors of higher positive clinical impact.
Results:
We evaluated 1000 unique clinical cases of KT from 941 patients between January 1, 2017, and August 31, 2023. The cohort included adult (70%) and pediatric (30%) patients. The overall clinical impact of KT was positive in 16%, negative in 2%, and absent in 82% of cases. Among adult patients, multivariable logistic regression modeling showed that culture-negative endocarditis (OR 2.3; 95% CI, 1.11–4.53; P = .022) and concern for fastidious/zoonotic/vector-borne pathogens (OR 2.1; 95% CI, 1.11–3.76; P = .019) were associated with positive clinical impact of KT. Host immunocompromised status was not reliably associated with a positive clinical impact of KT (OR 1.03; 95% CI, 0.83–1.29; P = .7806). No significant predictors of KT clinical impact were found in pediatric patients. Microbiologic result pattern was also a significant predictor of impact.
Conclusions:
Our study highlights that despite the positive clinical impact of KT in select situations, most testing results had no clinical impact. We also confirm diagnostic indications where KT may have the highest yield, thereby generating tools for diagnostic stewardship.
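A minimal sketch of the multivariable logistic regression approach described above, using statsmodels (the variable names and data are hypothetical, not the study dataset): odds ratios and 95% CIs are obtained by exponentiating the fitted coefficients and their confidence limits.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "positive_impact": rng.integers(0, 2, n),
    "culture_neg_endocarditis": rng.integers(0, 2, n),
    "fastidious_pathogen_concern": rng.integers(0, 2, n),
    "immunocompromised": rng.integers(0, 2, n),
})

model = smf.logit(
    "positive_impact ~ culture_neg_endocarditis + fastidious_pathogen_concern + immunocompromised",
    data=df,
).fit(disp=False)

# Exponentiate coefficients and confidence limits to report ORs with 95% CIs.
odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_lower": np.exp(model.conf_int()[0]),
    "CI_upper": np.exp(model.conf_int()[1]),
    "p": model.pvalues,
})
print(odds_ratios.round(3))
```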
Interviews with 22 home-based primary care (HBPC) clinicians revealed that infectious disease physicians and clinical pharmacists facilitate infection management and antibiotic selection, respectively, and that local initiatives within programs support antibiotic prescribing decisions. Interventions that facilitate specialist engagement and tailored approaches that address the unique challenges of HBPC are needed.
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association studies (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD patients and BPD patients lie primarily on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
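A minimal sketch of how a polygenic risk score is computed from per-variant effect weights and evaluated for case-case discrimination (the genotypes, weights and phenotype below are synthetic; real pipelines involve tools such as PLINK, LD-aware weighting and careful quality control).

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n_people, n_snps = 2000, 500

# Synthetic allele-dosage matrix (0/1/2 copies of the effect allele) and GWAS weights.
dosages = rng.integers(0, 3, size=(n_people, n_snps))
weights = rng.normal(0, 0.05, size=n_snps)   # per-SNP effect sizes (log-odds scale)

# PRS: weighted sum of dosages per person, then standardised.
prs = dosages @ weights
prs = (prs - prs.mean()) / prs.std()

# Synthetic phenotype: 1 = BPD, 0 = MDD, weakly related to the score for illustration.
phenotype = rng.binomial(1, p=1.0 / (1.0 + np.exp(-0.3 * prs)))

print(f"case-case AUC = {roc_auc_score(phenotype, prs):.2f}")
```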
We present spectroscopic properties of 22 Ly$\alpha$ emitters (LAEs) at $z = 5.5 - 6.6$ with Ly$\alpha$ luminosity $\mathrm{log}( L_{\mathrm{Ly}\alpha} \, [\mathrm{erg} \, \mathrm{s}^{-1}]) = 42.4 - 43.5$, obtained using VLT/MUSE as part of the Middle Ages Galaxy Properties with Integral Field Spectroscopy (MAGPI) survey. Additionally, we incorporate broad-band photometric data from the Subaru Hyper Suprime-Cam (HSC) Wide layer for 17 LAEs in our sample. The HSC-y band magnitudes show that our LAEs are UV-bright, with rest-frame absolute UV magnitudes $-23.27 \leq \mathrm{M}_{\mathrm{UV}} \leq -19.74$. We find that the Ly$\alpha$ line width increases with Ly$\alpha$ luminosity, and this trend becomes more prominent at $z > 6$, where Ly$\alpha$ lines become significantly broadened ($\gtrsim 260 \, \mathrm{km}\, \mathrm{s}^{-1}$) at luminosities $\mathrm{log}( L_{\mathrm{Ly}\alpha} \, [\mathrm{erg} \, \mathrm{s}^{-1}]) > 43$. This broadening is consistent with previous studies, suggesting that these sources are located inside larger ionised bubbles. We observe a slightly elevated ionising photon production efficiency for LAEs at $z > 6$, which indicates that younger galaxies could be producing more ionising photons per unit UV luminosity. A tentative anti-correlation between ionising photon production efficiency and Ly$\alpha$ rest-frame equivalent width is seen, which could indicate a time delay between the production and escape of ionising photons, primarily due to supernova activity. Furthermore, we find a positive correlation between the radius of ionised regions and Ly$\alpha$ line width, which again suggests that large ionised bubbles are created around these LAEs, shielding them from the scattering effects of the intergalactic medium (IGM). We also detect two very closely separated LAEs at $z = 6.046$ (projected separation between the cores of 15.92 kpc), the LAE pair with the smallest separation yet discovered in the reionisation epoch. The sizes of their respective bubbles suggest that they likely sit inside a common large ionised region. Such a closely separated LAE pair increases the size of the ionised bubble, potentially allowing boosted transmission of Ly$\alpha$ through the neutral IGM, and supports an accelerated reionisation scenario.
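As an illustration of how the rest-frame absolute UV magnitudes quoted above follow from apparent HSC-y magnitudes, a minimal sketch using astropy; the flat-spectrum bandwidth term used here is a common approximation, and the survey's actual K-corrections may differ.

```python
import numpy as np
from astropy.cosmology import Planck18 as cosmo

def absolute_uv_magnitude(m_apparent, z):
    """Approximate rest-frame absolute UV magnitude from an apparent magnitude,
    assuming a flat (f_nu = const) UV spectrum so the K-correction reduces to
    the +2.5 log10(1+z) bandwidth term."""
    distance_modulus = cosmo.distmod(z).value
    return m_apparent - distance_modulus + 2.5 * np.log10(1.0 + z)

# Example: an LAE with HSC-y magnitude 24.5 at z = 6.0 (illustrative values).
print(f"M_UV ~ {absolute_uv_magnitude(24.5, 6.0):.2f}")
```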
While there is evidence that long-chain n-3 PUFA supplementation benefits mood, the extent to which a single high dose of n-3 PUFA can induce acute mood effects has not been examined. The present study investigated whether a single dose of a DHA-rich powder affects self-reported mood in middle-aged males during elevated cognitive demand. In a randomised, double-blind, placebo-controlled trial with a balanced crossover design, twenty-nine healthy males (age M = 52.8 years, sd = 5.3) were administered a powder (in a meal) containing 4·74 g n-3 PUFA (DHA 4020 mg; EPA 720 mg) or placebo in random order on two different testing days separated by a washout period of 7 ± 3 d. Participants completed mood assessments before and after completing two cognitive test batteries at baseline and again 3·5–4·0 h following the consumption of the active treatment or placebo. While completion of the cognitive test batteries increased negative mood, differential effects for alertness (P = 0·008) and stress (P = 0·04) followed consumption of the DHA-rich powder compared with placebo. Although alertness declined when completing the cognitive batteries, it was higher following consumption of the DHA-rich powder compared with placebo (P = 0·006). Conversely, stress was lower following consumption of the DHA-rich powder relative to placebo, though this difference only approached significance (P = 0·05). Overall, results from this pilot study demonstrate that a single high dose of n-3 PUFA may deliver acute mood benefits following elevated cognitive demand in healthy middle-aged males.
Interstage monitoring programs for single ventricle disease have been developed to reduce morbidity and mortality. There is increased use of telemedicine and mobile application monitoring. It is unknown if there are disparities in use based on patient socio-demographic factors.
Methods:
We conducted a retrospective cohort study of patients enrolled in the single ventricle monitoring program and KidsHeart application at a single centre from 4/21/2021 to 12/31/2023. We investigated the association of socio-demographic factors with telemedicine usage, mobile application enrollment and usage. We assessed resource utilisation and weight changes by program era.
Results:
There were 94 children in the cohort. Patients with Norwood and ductal stent palliation had a higher mean number of telemedicine visits per month (1.8 visits, p = 0.004), without differences based on socio-demographic factors. There were differences in application enrollment, with more Black patients enrolled compared to White patients (p = 0.016). Fewer Hispanic patients were enrolled than non-Hispanic patients (p = 0.034). No Spanish-speaking patients were enrolled (p = 0.0015). No patients whose mothers had less than a high school education were enrolled, while all those whose mothers held an advanced degree were enrolled (p = 0.0016). Mobile application use was lower among patients from neighbourhoods in the lowest income quartile. Emergency department visits decreased with mobile application monitoring. Mean weight-for-age z-scores increased from the start to the completion of the program in all eras.
Discussion:
Differences were seen in mobile application enrollment and usage based on socio-demographic factors. Further work is needed to ensure that all patients have access to mobile application monitoring.
Diagnostic criteria for major depressive disorder allow for heterogeneous symptom profiles but genetic analysis of major depressive symptoms has the potential to identify clinical and etiological subtypes. There are several challenges to integrating symptom data from genetically informative cohorts, such as sample size differences between clinical and community cohorts and various patterns of missing data.
Methods
We conducted genome-wide association studies of major depressive symptoms in three cohorts that were enriched for participants with a diagnosis of depression (Psychiatric Genomics Consortium, Australian Genetics of Depression Study, Generation Scotland) and three community cohorts that were not recruited on the basis of diagnosis (Avon Longitudinal Study of Parents and Children, Estonian Biobank, and UK Biobank). We fit a series of confirmatory factor models with factors that accounted for how symptom data were sampled and then compared alternative models with different symptom factors.
Results
The best fitting model had a distinct factor for Appetite/Weight symptoms and an additional measurement factor that accounted for the skip-structure in community cohorts (use of Depression and Anhedonia as gating symptoms).
Conclusion
The results show the importance of assessing the directionality of symptoms (such as hypersomnia versus insomnia) and of accounting for study and measurement design when meta-analyzing genetic association data.
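A structural sketch of the kind of model comparison described above, assuming the Python semopy package and hypothetical item-level symptom data purely for illustration; the study's actual analysis worked with genetic association data and included measurement factors for the skip-structure, which are not reproduced here.

```python
import numpy as np
import pandas as pd
import semopy

rng = np.random.default_rng(0)
items = ["mood", "anhedonia", "sleep", "fatigue", "guilt",
         "concentration", "appetite", "weight"]
# Hypothetical symptom endorsements (0/1); real analyses would use cohort data.
df = pd.DataFrame(rng.integers(0, 2, size=(1000, len(items))), columns=items)

# Model A: a single general depression factor.
one_factor = """
dep =~ mood + anhedonia + sleep + fatigue + guilt + concentration + appetite + weight
"""

# Model B: a separate Appetite/Weight factor alongside the general factor.
two_factor = """
dep    =~ mood + anhedonia + sleep + fatigue + guilt + concentration
app_wt =~ appetite + weight
"""

fits = {}
for name, desc in [("one_factor", one_factor), ("two_factor", two_factor)]:
    model = semopy.Model(desc)
    model.fit(df)
    fits[name] = semopy.calc_stats(model)

# Compare fit indices (AIC/BIC, CFI, RMSEA) across the candidate models.
print(pd.concat(fits))
```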
If livestock at risk of poor welfare could be identified using a risk assessment tool, more targeted response strategies could be developed by enforcement agencies to facilitate early intervention, prompt welfare improvement and a decrease in reoffending. This study aimed to test the ability of an Animal Welfare Risk Assessment Tool (AWRAT) to identify livestock at risk of poor welfare in extensive farming systems in Australia. Following farm visits for welfare- and non-welfare-related reasons, participants completed a single welfare rating (WR) and an assessment using the AWRAT for the farm just visited. A novel algorithm was developed to generate an AWRAT-Risk Rating (AWRAT-RR) based on the AWRAT assessment. Using linear regression, the relationship between the AWRAT-RR and the WR was tested. Based on this preliminary testing, the AWRAT was good at identifying farms with poor livestock welfare. As the AWRAT relies upon observation, intra- and inter-observer agreement were compared in an observation study in which a set of photographs of farm features was rated on two occasions. Intra-observer reliability was good, with 83% of intra-class correlation coefficients (ICCs) for observers ≥ 0.8. Inter-observer reliability was moderate, with an ICC of 0.67. The AWRAT provides a structured framework to improve consistency in livestock welfare assessments. Further research is necessary to determine the AWRAT’s ability to identify livestock at risk of poor welfare by studying animal welfare incidents and reoffending over time.
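A minimal sketch of the inter-observer reliability calculation reported above: a two-way, absolute-agreement intraclass correlation, ICC(2,1), computed directly from ANOVA mean squares (Shrout & Fleiss formulation); the ratings matrix is hypothetical.

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rating.
    ratings: (n_subjects, k_raters) array with no missing values."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)
    col_means = ratings.mean(axis=0)

    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_error = ((ratings - grand) ** 2).sum() - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)              # between-subject mean square
    ms_cols = ss_cols / (k - 1)              # between-rater mean square
    ms_error = ss_error / ((n - 1) * (k - 1))

    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )

# Hypothetical ratings of 6 farm-feature photographs by 3 observers.
photos = np.array([[3, 3, 2],
                   [5, 4, 5],
                   [1, 2, 1],
                   [4, 4, 4],
                   [2, 1, 2],
                   [5, 5, 4]])
print(f"ICC(2,1) = {icc_2_1(photos):.2f}")
```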
The objective of this study was to identify factors more commonly observed on farms with poor livestock welfare than on farms with good welfare. Potentially, these factors may be used to develop an animal welfare risk assessment tool (AWRAT) that could be used to identify livestock at risk of poor welfare. Identifying livestock at risk of poor welfare would facilitate early intervention and improve strategies to promptly resolve welfare issues. This study focuses on cattle, sheep and goats in non-dairy extensive farming systems in Australia. To assist with identifying potential risk factors, a survey was developed presenting 99 factors about the farm, farmers, animals and various aspects of management. Based on their experience, key stakeholders, including veterinarians, stock agents, consultants, and extension and animal welfare officers, were asked to consider a farm where the welfare of the livestock was either high or low and to rate the likelihood of observing these factors. Of the 141 responses, 65% were for farms with low welfare. Only 6% of factors had ratings that were not significantly different between high- and low-welfare surveys, and these were not considered further. Factors from poor-welfare surveys with median ratings in the lowest 25% were considered potential risks (n = 49). Considering correlation, ease of verification and the different livestock farming systems in Australia, 18 risk factors relating to farm infrastructure, nutrition, treatment and husbandry were selected. The AWRAT requires validation in future studies.
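A minimal sketch of the selection rule described above (column names and ratings are hypothetical): median ratings per factor are computed from the low-welfare surveys, and factors whose medians fall in the lowest quartile are retained as candidate risk factors.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
factor_names = [f"factor_{i:02d}" for i in range(99)]

# Hypothetical long-format ratings from surveys describing low-welfare farms
# (lower rating = more likely to be observed on a poor-welfare farm).
low_welfare = pd.DataFrame({
    "factor": np.repeat(factor_names, 40),
    "rating": rng.integers(1, 8, size=99 * 40),
})

medians = low_welfare.groupby("factor")["rating"].median()
cutoff = medians.quantile(0.25)

candidate_risk_factors = medians[medians <= cutoff].index.tolist()
print(len(candidate_risk_factors), "candidate risk factors")
```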
The COVID-19 pandemic has presented numerous challenges to older adults in Canada, including the ability to volunteer. The purpose of this study is to improve the understanding of the social context surrounding volunteering in Canada, by (a) determining changes in associations between human, social, and cultural capital and volunteering among older adults; and (b) examining the relationship between ethnic minority status and volunteering, using data from the Canadian Longitudinal Study on Aging (CLSA), collected prior to and during the pandemic. This study utilized data from 24,306 CLSA Baseline, Follow-up 1 (FUP1), and COVID-19 Baseline Survey participants (aged 55+). Results confirm a decrease in volunteering during the early stages of the pandemic. Compared to pre-pandemic associations, volunteers during the early stages of the pandemic were more likely to be young–old, male, employed, and not involved in religious activities. Findings provide evidence of pandemic effects on volunteering among older adults in Canada.
Based on calibrated radiocarbon ages of terrestrial gastropod shells (Succineidae, Discus, Stenotrema, Webbhelix), the chronology of Peoria Silt (loess) deposition in the Central Lowlands is updated. These taxa provide reliable ages (within ~0.2 ka), based on historical shell dating, shell-organic age comparisons, and stratigraphic consistency. A compilation of 53 new and 36 published calibrated Peoria Silt shell ages from 12 localities dates from 30.0 to 17.4 ka. Proximal (fossiliferous) loess from 10 sections had mean loess accumulation rates of 0.6–2.2 mm/yr. Study sites along the upper Mississippi, Illinois to mid-Mississippi, and Ohio-Wabash Valleys suggest Peoria loess accumulated from ~27 to 15 ka, ~29 to 18 ka, and ~30 to 18 ka, respectively. The cessation age for Peoria Silt, based on surface extrapolations, is ~1–6 ka earlier than some prior Illinois estimates, even assuming slower loess accumulation in the modern solum. Younger loess in northwestern Illinois likely reflects, in part, Superior and Des Moines Lobe glacial-meltwater sediment and Iowan Erosion Surface inputs to the upper Mississippi Valley after the Lake Michigan Lobe receded. Furthermore, stronger winds, drier conditions, and reduced vegetation cover in valley deflation areas may have favored higher accumulation rates and later loess deposition in northwestern relative to southeastern areas.
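The accumulation rates quoted above follow from section thickness divided by depositional duration; a minimal worked example (the thickness and ages are illustrative, not measured values from the studied sections):

```python
def accumulation_rate_mm_per_yr(thickness_m, start_ka, end_ka):
    """Mean loess accumulation rate in mm/yr from section thickness (m)
    and the calibrated start/end ages of deposition (ka)."""
    duration_yr = (start_ka - end_ka) * 1000.0
    return thickness_m * 1000.0 / duration_yr

# e.g. a 10 m proximal section deposited between ~29 ka and ~18 ka:
print(f"{accumulation_rate_mm_per_yr(10.0, 29.0, 18.0):.2f} mm/yr")
```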
This study compared the likelihood of long-term sequelae following infection with SARS-CoV-2 variants, other acute respiratory infections (ARIs) and no infection. Participants (n=5,630) were drawn from Virus Watch, a prospective community cohort investigating SARS-CoV-2 epidemiology in England. Using logistic regression, we compared predicted probabilities of developing long-term symptoms (>2 months) during different variant dominance periods according to infection status (SARS-CoV-2, other ARI, or no infection), adjusting for confounding by demographic and clinical factors and vaccination status. SARS-CoV-2 infection during early variant periods up to Omicron BA.1 was associated with a greater probability of long-term sequelae (adjusted predicted probability (PP) range 0.27, 95% CI 0.22–0.33 to 0.34, 95% CI 0.25–0.43) compared with later Omicron sub-variants (PP range 0.11, 95% CI 0.08–0.15 to 0.14, 95% CI 0.10–0.18). While differences between SARS-CoV-2 and other ARIs (PP range 0.08, 95% CI 0.04–0.11 to 0.23, 95% CI 0.18–0.28) varied by period, all post-infection estimates substantially exceeded those for non-infected participants (PP range 0.01, 95% CI 0.00–0.02 to 0.03, 95% CI 0.01–0.06). Variant was an important predictor of SARS-CoV-2 post-infection sequelae, with recent Omicron sub-variants demonstrating similar probabilities to other contemporaneous ARIs. Further aetiological investigation including between-pathogen comparison is recommended.