Identifying optimal methods for sampling surfaces in the healthcare environment is critical for future research requiring the identification of multidrug-resistant organisms (MDROs) on surfaces.
Methods:
We compared two swabbing methods, a flocked swab versus a sponge-stick, for recovery of MDROs by culture and for recovery of bacterial DNA via quantitative 16S polymerase chain reaction (PCR). The comparison was conducted by assessing swab performance in a longitudinal survey of MDRO contamination in hospital rooms. A laboratory-prepared surface was also used to compare recovery by each swab type over a matched surface area.
Results:
Sponge-sticks were superior to flocked swabs for culture-based recovery of MDROs, with a sensitivity of 80% compared to 58%. Similarly, sponge-sticks demonstrated greater recovery of Staphylococcus aureus from laboratory-prepared surfaces, although the performance of flocked swabs improved when premoistened. In contrast, recovery of bacterial DNA via quantitative 16S PCR was greater with flocked swabs by an average of 3 log copies per specimen.
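The sensitivity comparison above reduces to a simple true-positive/false-negative tally against a reference standard; a minimal sketch (the counts below are illustrative, chosen only to reproduce the reported percentages, and are not the study's raw data):

```python
def sensitivity(true_positives: int, false_negatives: int) -> float:
    """Sensitivity = TP / (TP + FN): the fraction of truly contaminated
    surfaces that a swab method recovers by culture."""
    return true_positives / (true_positives + false_negatives)

# Illustrative counts only: out of 100 reference-positive surfaces,
# a sponge-stick detecting 80 and a flocked swab detecting 58 yield
# the sensitivities reported in the abstract.
print(sensitivity(80, 20))  # 0.8
print(sensitivity(58, 42))  # 0.58
```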
Conclusions:
The optimal swabbing method of environmental surfaces differs by method of analysis. Sponge-sticks were superior to flocked swabs for culture-based detection of bacteria but inferior for recovery of bacterial DNA.
The gut microbiome is impacted by certain types of dietary fibre. However, the type, duration and dose needed to elicit gut microbial changes, and whether these changes also influence microbial metabolites, remain unclear. This study investigated the effects of supplementing healthy participants with two types of non-digestible carbohydrates (resistant starch (RS) and polydextrose (PD)) on the stool microbiota and on microbial metabolite concentrations in plasma, stool and urine, as secondary outcomes in the Dietary Intervention Stem Cells and Colorectal Cancer (DISC) Study. The DISC study was a double-blind, randomised controlled trial that supplemented healthy participants with RS and/or PD or placebo for 50 d in a 2 × 2 factorial design. DNA was extracted from stool samples collected pre- and post-intervention, and V4 16S rRNA gene sequencing was used to profile the gut microbiota. Metabolite concentrations were measured in stool, plasma and urine by high-performance liquid chromatography. A total of fifty-eight participants with paired samples available were included. After 50 d, no effects of RS or PD were detected on gut microbiota diversity (alpha- or beta-diversity), on genus relative abundance or on metabolite concentrations. However, a Dirichlet multinomial mixture clustering-based approach suggested that some participants changed microbial enterotype post-intervention. The gut microbiota and faecal, plasma and urinary microbial metabolites were stable in response to a 50-d fibre intervention in middle-aged adults. Larger and longer studies, including those which explore the effects of specific fibre sub-types, may be required to determine the relationships between fibre intake, the gut microbiome and host health.
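Alpha diversity, one of the community metrics examined in the DISC analysis, is computed from taxon abundances; a minimal sketch of the Shannon index, one common alpha-diversity measure (used here purely as an illustration of the metric, not as the study's pipeline):

```python
import math

def shannon_index(counts: list[int]) -> float:
    """Shannon diversity H' = -sum(p_i * ln(p_i)) over taxa with
    nonzero counts; higher values indicate a more diverse community."""
    total = sum(counts)
    proportions = (c / total for c in counts if c > 0)
    return -sum(p * math.log(p) for p in proportions)

# A perfectly even 4-taxon community has H' = ln(4) ~ 1.386,
# while a single-taxon sample has H' = 0.
print(round(shannon_index([25, 25, 25, 25]), 3))  # 1.386
print(shannon_index([100]) == 0.0)                # True
```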
Clostridioides difficile infection (CDI) research relies upon accurate identification of cases when using electronic health record (EHR) data. We developed and validated a multi-component algorithm to identify hospital-associated CDI (HA-CDI) using EHR data and determined that the tandem of CDI-specific treatment and laboratory testing has 97% accuracy in identifying HA-CDI cases.
Introducing soybean cultivars resistant to 2,4-D and dicamba allowed for postemergence applications of these herbicides. These herbicides pose a high risk for off-target movement, and the potential influence on crops such as hemp is unknown. Two studies were conducted from 2020 through 2022 in controlled environments to evaluate hemp response to rates simulating off-target events of 2,4-D and dicamba. The objectives of these studies were to (1) determine the effects of herbicide (2,4-D and dicamba) and rate (1× to 1/100,000× labeled rate) on visible injury, height, and branching, and (2) determine the effect of 2,4-D rate (1× to 1/100,000× labeled rate) on visible injury, height, branching, and reproductive parameters. Herbicides were applied in the early vegetative stage, and evaluations took place 14 and 28 d after treatment (DAT) and at trial termination (42 DAT in the greenhouse trial and at harvest in the growth chamber trial). In the greenhouse study, 2,4-D and dicamba at the 1× rate, and the 1/10× rate of dicamba, caused 68%, 78%, and 20% injury 28 DAT, respectively. At the time of trial termination 42 DAT, plants treated with 1× rates of 2,4-D and dicamba, or 1/10× dicamba, were 19, 25, and 9 cm shorter than the nontreated control, respectively. Simulated off-target rates of 2,4-D and dicamba did not influence branching or plant weight at trial termination. In the growth chamber study, the 1× and 1/10× rates of 2,4-D caused 82% and 2% injury 28 DAT, respectively. Plant height, fresh weight, and cannabidiol (CBD) levels of plants treated with simulated off-target rates of 2,4-D were not different from the nontreated control. These studies suggest that hemp grown for CBD exposed to off-target rates of 2,4-D or dicamba in early vegetative stages may not have distinguishable effects 42 DAT or at harvest.
The current study had two primary objectives: 1) To assess the dose-response relationship between acute bouts of aerobic exercise intensity and performance in multiple cognitive domains (episodic memory, attention, and executive function) and 2) To replicate and extend the literature by examining the dose-response relationship between aerobic exercise intensity and pattern separation.
Participants and Methods:
Eighteen young adults (mean age = 21.6, SD = 2.6; mean education = 13.9 years, SD = 3.4; 50% female) were recruited from The Ohio State University and surrounding area (Columbus, OH). Participants completed control (no exercise), light-intensity, and vigorous-intensity exercise conditions across three counterbalanced appointments. For each participant, all three appointments occurred at approximately the same time of day, with at least 2 days between appointments. Following the rest or exercise conditions and after an approximately 7-minute delay, participants completed a Mnemonic Similarity Task (MST; Stark et al., 2019) to assess pattern separation. This task was always administered first, as we attempted to replicate previous studies and further clarify the relationship between acute bouts of aerobic exercise and pattern separation by implementing an exercise stimulus that varied in intensity. After the MST, three brief cognitive tasks (roughly 5 min each) were administered in a counterbalanced order: a gradual-onset continuous performance task (gradCPT; Esterman et al., 2013), the flanker task from the NIH Toolbox, and a face-name episodic memory task. Here we report results from the gradCPT, which assesses sustained attention and inhibitory control. Heart rate and ratings of perceived exertion were collected to validate the rest and exercise conditions. Repeated-measures ANOVAs were used to assess the relationship between exercise condition and dependent measures of sustained attention, inhibitory control, and pattern separation.
Results:
One-way repeated-measures ANOVAs revealed a main effect of exercise condition on gradCPT task performance for task discrimination ability (d') and commission error rate (p’s < .05). Pairwise comparisons revealed task discrimination ability was significantly higher following the light intensity exercise condition versus the control condition. Commission error rate was significantly lower for both the light and vigorous exercise conditions compared to the control condition. For the MST, two-way repeated-measures ANOVAs revealed an expected significant main effect of lure similarity on task performance; however, there was not a significant main effect of exercise intensity on task performance (or a significant interaction).
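Discrimination ability d' on a continuous performance task such as the gradCPT is conventionally computed from hit and false-alarm rates via the inverse standard normal CDF; a minimal sketch of that standard computation (the rates below are illustrative, not the study's data):

```python
from statistics import NormalDist

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Signal-detection d' = z(hit rate) - z(false-alarm rate),
    where z is the inverse of the standard normal CDF."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

# Illustrative values: 90% hits with 10% false alarms.
print(round(d_prime(0.90, 0.10), 3))  # 2.563
```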
Conclusions:
The current study indicated that acute bouts of exercise improve both sustained attention and inhibitory control as measured with the gradCPT. We did not replicate previous work reporting that acute bouts of exercise improve pattern separation in young adults. Our results further indicate that vigorous exercise neither impaired nor improved pattern separation performance. Finally, light-intensity exercise appears sufficient to enhance sustained attention and inhibitory control, as there were no significant differences in performance following light versus vigorous exercise.
The Pediatric Epilepsy Research Consortium (PERC) Epilepsy Surgery Database Project is a multisite collaborative that includes neuropsychological evaluations of children presenting for epilepsy surgery. There is some evidence for specific neuropsychological phenotypes within epilepsy (Hermann et al., 2016); however, this is less clear in pediatric patients. As a first step, we applied an empirically based subtyping approach to determine if there were specific profiles using indices from the Wechsler scales [Verbal IQ (VIQ), Nonverbal IQ (NVIQ), Processing Speed Index (PSI), Working Memory Index (WMI)]. We hypothesized that there would be at least four profiles, including profiles distinguished by slow processing speed and poor working memory as well as profiles with significant differences between verbal and nonverbal reasoning abilities.
Participants and Methods:
Our study included 372 children (M = 12.1 years, SD = 4.1; 77.4% White; 48% male) who completed enough of an age-appropriate Wechsler measure to render at least two index scores. Epilepsy characteristics included 84.4% with focal epilepsy (evenly distributed between left and right focus) and 13.5% with generalized or mixed seizure types; mean age of onset = 6.7 years, SD = 4.5; seizure frequency ranged from daily to less than monthly; 53% had a structural etiology; 71% had an abnormal MRI; and the mean number of antiseizure medications was two. Latent profile analysis was used to identify discrete underlying cognitive profiles based on intellectual functioning. Demographic and epilepsy characteristics were compared among profiles.
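Latent profile analysis is typically fit with specialized software (e.g. Mplus or R's mclust), but the class-enumeration step it relies on can be sketched generically: fit candidate models with different numbers of profiles and pick the one with the lowest Bayesian information criterion. A minimal sketch (the log-likelihoods and parameter counts below are invented for illustration, not the study's fitted values):

```python
import math

def bic(log_likelihood: float, n_params: int, n_obs: int) -> float:
    """Bayesian information criterion: lower is better. Penalizes
    model complexity by n_params * ln(n_obs)."""
    return n_params * math.log(n_obs) - 2.0 * log_likelihood

# Hypothetical class-enumeration table for models with 2, 3, and 4
# latent profiles fit to n = 372 children: {k: (logL, n_params)}.
candidates = {2: (-5210.0, 13), 3: (-5150.0, 18), 4: (-5140.0, 23)}
scores = {k: bic(ll, p, 372) for k, (ll, p) in candidates.items()}
best = min(scores, key=scores.get)
print(best)  # 3
```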
Results:
Based on class enumeration procedures, a 3-cluster solution provided the best fit for the data, with profiles characterized by generally Average, Low Average, or Below Average functioning. 32.8% were in the Average profile, with mean index scores ranging from 91.7 to 103.2; 47.6% were in the Low Average profile, with mean index scores ranging from 80.7 to 84.5; and 19.6% were in the Below Average profile, with mean index scores ranging from 55.0 to 63.1. Across all profiles, the lowest mean score was the PSI, followed by the WMI. VIQ and NVIQ represented relatively higher scores for all three profiles. The mean discrepancy between indices within a profile was as large as 11.5 IQ points. No demographic or epilepsy characteristics differed significantly across cognitive phenotypes.
Conclusions:
Latent cognitive phenotypes in a pediatric presurgical cohort were differentiated by general level of functioning; however, across profiles, processing speed was consistently the lowest index, followed by working memory. These findings suggest a common relative weakness across phenotypes, which may result from a global effect of antiseizure medications and/or the widespread impact of seizures on neural networks even in a largely focal epilepsy cohort, similar to adult studies of temporal lobe epilepsy (Hermann et al., 2007). Future work will use latent profile analysis to examine phenotypes across other domains relevant to pediatric epilepsy, including attention, naming, motor, and memory functioning. These findings are in line with collaborative efforts toward cognitive phenotyping, an aim of our PERC Epilepsy Surgery Database Project, which has already established one of the largest pediatric epilepsy surgery cohorts.
We recently reported on the radio-frequency attenuation length of cold polar ice at Summit Station, Greenland, based on bi-static radar measurements of radio-frequency bedrock echo strengths taken during the summer of 2021. Those data also allow studies of (a) the relative contributions of coherent (such as discrete internal conducting layers with sub-centimeter transverse scale) vs incoherent (e.g. bulk volumetric) scattering, (b) the magnitude of internal layer reflection coefficients, (c) limits on signal propagation velocity asymmetries (‘birefringence’) and (d) limits on in-ice signal dispersion over a bandwidth of ~100 MHz. We find that (1) attenuation lengths approach 1 km in our band, (2) after averaging 10 000 echo triggers, reflected signals observable over the thermal floor (to depths of ~1500 m) are consistent with being entirely coherent, (3) internal layer reflectivities are approximately −60 to −70 dB, (4) birefringent effects for vertically propagating signals are smaller by an order of magnitude relative to South Pole and (5) within our experimental limits, glacial ice is non-dispersive over the frequency band relevant for neutrino detection experiments.
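The reflectivity figures can be read as power ratios via the standard decibel conversion; a minimal sketch:

```python
import math

def to_db(power_ratio: float) -> float:
    """Convert a reflected/incident power ratio to decibels."""
    return 10.0 * math.log10(power_ratio)

def from_db(db: float) -> float:
    """Invert the conversion: -60 dB corresponds to one part in
    10**6 of the incident power being reflected."""
    return 10.0 ** (db / 10.0)

print(to_db(1e-6))   # -60.0
print(from_db(-70))  # 1e-07
```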
New technologies and disruptions related to Coronavirus disease-2019 have led to expansion of decentralized approaches to clinical trials. Remote tools and methods hold promise for increasing trial efficiency and reducing burdens and barriers by facilitating participation outside of traditional clinical settings and taking studies directly to participants. The Trial Innovation Network, established in 2016 by the National Center for Advancing Clinical and Translational Science to address critical roadblocks in clinical research and accelerate the translational research process, has consulted on over 400 research study proposals to date. Its recommendations for decentralized approaches have included eConsent, participant-informed study design, remote intervention, study task reminders, social media recruitment, and return of results for participants. Some clinical trial elements have worked well when decentralized, while others, including remote recruitment and patient monitoring, need further refinement and assessment to determine their value. Partially decentralized, or “hybrid” trials, offer a first step to optimizing remote methods. Decentralized processes demonstrate potential to improve urban-rural diversity, but their impact on inclusion of racially and ethnically marginalized populations requires further study. To optimize inclusive participation in decentralized clinical trials, efforts must be made to build trust among marginalized communities, and to ensure access to remote technology.
The fourteenth canon of the Sixth Council of Toledo (638) declares it inhuman (inhumanum) not to reward fidelity. This reveals that the council had a concept of ‘human nature’ and that it was ready to use it to discipline and punish. This chapter works to uncover that seventh-century Visigothic ontology and its relationship to faith, and, in the process, reveals how by this ontological discourse an ontotheology that excluded Jews from human society emerged.
Keywords: Visigothic ontology, ideology, theology, Judaism, Catholicism, Isidore of Seville
‘I know that I am a human being.’ In order to see how unclear the sense of this proposition is, consider its negation.
– Wittgenstein, On Certainty
The role of fear in the ordering of Visigothic society: Reason speaking to Man: ‘Let the destruction of godless people draw you back from sin; […] let the extinction of the condemned pull you aside.’
– Isidore, Synonyms, 1.51
Well then, my perfect historian must start with two indispensable qualifications: the one is political insight, the other the faculty of expression.
– Lucian, The Way to Write History, 34
The following research represents the early findings of my current monograph project in which I propose that Visigothic Catholicism – and perhaps Catholicism more broadly in Late Antiquity – functioned, or intended to function, as secular ideology and not as religion. Instead of reflecting the History-shattering Truth Event that was the Christ Event and the alternative truths that Jesus demanded of his faithful subjects – such as the full renunciation of wealth – Visigothic Catholicism advocated and performed as a false commitment to the Christ Event, as a commitment, instead, to other prevailing truths of Late Antiquity but with the appearance of being Christian (i.e. faithful to the radical Christian Truth). As such, this means two things:
1. Visigothic Catholicism operated as secular ideology that used the identifier ‘Christian’ as an Imaginary Subjectivity to prevent the encounter with the Real, with the genuine Christian Truth.
2. It is in this gap between conservative, ideological operation and professed commitment to a radical, anti-historical (i.e. anti-ideological) Event that we can see the essence of Visigothic Catholicism, its real intentions, the meaning of its acting-out, and its anti-Christian, non-transformative discourse.
With the exponential growth in investment attention to brain health—solutions spanning brain wellness to mental health to neurological disorders—tech giants, payers, and biotechnology companies have been making forays into this field to identify technology solutions and pharmaceutical amplifiers. So far, their investments have had mixed results. The concept of open innovation (OI) was first coined by Henry Chesbrough to describe the paradigm by which enterprises allow free flow of ideas, products, and services from the outside to the inside and vice versa in order to remain competitive, particularly in rapidly evolving fields where there is abundant, relevant knowledge outside the traditional walls of the enterprise. In this article, we advocate for further exploration and advancement of OI in brain health.
Network modeling has been applied in a range of trauma-exposed samples, yet results are limited by an overreliance on cross-sectional data. The current analyses used posttraumatic stress disorder (PTSD) symptom data collected over a 5-year period to estimate a more robust between-subject network and an associated symptom-change network.
Methods
A PTSD symptom network is measured in a sample of military veterans across four time points (Ns = 1254, 1231, 1106, 925). The repeated measures permit isolating between-subject associations by limiting the effects of within-subject variability. The result is a highly reliable PTSD symptom network. A symptom slope network depicting covariation of symptom change over time is also estimated.
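Symptom networks of this kind are commonly estimated as partial-correlation (Gaussian graphical model) networks; a minimal numpy sketch of the core computation, assuming a symptom covariance matrix is already in hand (the 3-symptom matrix below is illustrative, not the study's data, and the authors' repeated-measures estimator involves additional steps to isolate between-subject variance):

```python
import numpy as np

def partial_correlations(cov: np.ndarray) -> np.ndarray:
    """Partial correlations from a covariance matrix: invert to get
    the precision matrix Omega, then scale so that
    rho_ij = -Omega_ij / sqrt(Omega_ii * Omega_jj).
    Off-diagonal entries are the network's edge weights."""
    omega = np.linalg.inv(cov)
    d = np.sqrt(np.diag(omega))
    rho = -omega / np.outer(d, d)
    np.fill_diagonal(rho, 1.0)
    return rho

# Illustrative 3-symptom covariance matrix.
cov = np.array([[1.0, 0.5, 0.2],
                [0.5, 1.0, 0.3],
                [0.2, 0.3, 1.0]])
edges = partial_correlations(cov)
print(edges.shape)  # (3, 3)
```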
Results
Negative trauma-related emotions had particularly strong associations with the network. Trauma-related amnesia, sleep disturbance, and self-destructive behavior had weaker overall associations with other PTSD symptoms.
Conclusions
PTSD's network structure appears stable over time. There is no single ‘most important’ node or node cluster. The relevance of self-destructive behavior, sleep disturbance, and trauma-related amnesia to the PTSD construct may deserve additional consideration.
To examine the association between adherence to plant-based diets and mortality.
Design:
Prospective study. We calculated a plant-based diet index (PDI) by assigning positive scores to plant foods and reverse scores to animal foods. We also created a healthful PDI (hPDI) and an unhealthful PDI (uPDI) by further separating the healthy plant foods from less-healthy plant foods.
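The index construction can be sketched as follows: food-group intakes are ranked into quintiles, plant groups are scored positively, and animal groups are reverse-scored. The group names and inputs below are illustrative stand-ins (the published PDI uses 18 food groups, and hPDI/uPDI further split plant foods into healthy and less-healthy subsets):

```python
# Hypothetical, abbreviated food-group lists for illustration only.
PLANT_GROUPS = {"whole_grains", "fruits", "vegetables", "nuts", "legumes"}
ANIMAL_GROUPS = {"meat", "dairy", "fish", "eggs"}

def pdi_score(quintiles: dict[str, int]) -> int:
    """Sum quintile ranks (1-5): plant foods count as-is, animal
    foods are reverse-scored (6 - quintile), so higher totals mean
    a more plant-based diet."""
    total = 0
    for group, q in quintiles.items():
        if group in PLANT_GROUPS:
            total += q
        elif group in ANIMAL_GROUPS:
            total += 6 - q
    return total

# Top quintile of every plant group plus bottom quintile of every
# animal group yields the maximum score for these 9 groups.
intake = {g: 5 for g in PLANT_GROUPS} | {g: 1 for g in ANIMAL_GROUPS}
print(pdi_score(intake))  # 45
```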
Setting:
The VA Million Veteran Program.
Participants:
315 919 men and women aged 19–104 years who completed a food-frequency questionnaire (FFQ) at baseline.
Results:
We documented 31 136 deaths during follow-up. A higher PDI was significantly associated with lower total mortality (hazard ratio (HR) comparing extreme deciles = 0·75, 95 % CI: 0·71, 0·79, Ptrend < 0·001). We observed an inverse association between hPDI and total mortality (HR comparing extreme deciles = 0·64, 95 % CI: 0·61, 0·68, Ptrend < 0·001), whereas uPDI was positively associated with total mortality (HR comparing extreme deciles = 1·41, 95 % CI: 1·33, 1·49, Ptrend < 0·001). Similar significant associations of PDI, hPDI and uPDI were also observed for CVD and cancer mortality. The associations between PDI and total mortality were consistent among African American and European American participants, among participants free from CVD and cancer, and among those diagnosed with a major chronic disease at baseline.
Conclusions:
A greater adherence to a plant-based diet was associated with substantially lower total mortality in this large population of veterans. These findings support recommending plant-rich dietary patterns for the prevention of major chronic diseases.
To examine the use of telemedicine among Canadian concussion providers and clinics before and after the COVID-19 pandemic onset and identify barriers and facilitators for future use.
Methods:
Ninety-nine concussion clinics and healthcare providers across Canada that offered one or more clinical concussion-related services were identified using standardized online searches and were approached to complete a cross-sectional online survey.
Results:
Thirty clinics or providers completed the survey, and two completed subsections of the survey (response rate of 32.3%). Only 28.1% of respondents indicated that they used telemedicine to provide care prior to the COVID-19 pandemic. Providers most commonly using telemedicine prior to the pandemic were occupational therapists and physicians, while the most commonly used services were videoconferencing and eConsultation. Most respondents (87%) indicated their clinic’s use of telemedicine changed following the onset of the COVID-19 pandemic, including new use of videoconferencing, telephone calls, and eConsultation. Ninety-three percent indicated that they would consider using telemedicine to provide care to their concussion patients once the pandemic was over. Barriers that needed to be overcome to facilitate use, or greater use, of telemedicine-based services were the inability to conduct a complete physical examination, lack of appropriate reimbursement, lack of start-up and maintenance funding, and medico-legal risk.
Conclusion:
Telemedicine was used by a minority of Canadian concussion clinics and providers prior to the COVID-19 pandemic but was rapidly adopted by many facilities. This study provides important insight into the factors that must be considered to optimize use of telemedicine in concussion care in the future.
The spatial and temporal extent of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) environmental contamination has not been precisely defined. We sought to elucidate contamination of different surface types and how contamination changes over time.
Methods:
We sampled surfaces longitudinally within COVID-19 patient rooms, performed quantitative RT-PCR for the detection of SARS-CoV-2 RNA, and modeled distance, time, and severity of illness on the probability of detecting SARS-CoV-2 using a mixed-effects binomial model.
Results:
The probability of detecting SARS-CoV-2 RNA in a patient room did not vary with distance. However, we found that surface type predicted probability of detection, with floors and high-touch surfaces having the highest probability of detection: floors (odds ratio [OR], 67.8; 95% credible interval [CrI], 36.3–131) and high-touch elevated surfaces (OR, 7.39; 95% CrI, 4.31–13.1). Increased surface contamination was observed in rooms where patients required high-flow oxygen, positive airway pressure, or mechanical ventilation (OR, 1.6; 95% CrI, 1.03–2.53). The probability of elevated surface contamination decayed with prolonged hospitalization, but the probability of floor detection increased with the duration of the local pandemic wave.
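The reported odds ratios can be translated into detection probabilities for any assumed baseline; a minimal sketch of that standard conversion (the 5% baseline below is illustrative, not a study estimate):

```python
def apply_odds_ratio(baseline_prob: float, odds_ratio: float) -> float:
    """Scale baseline odds by an odds ratio and convert back to a
    probability: p' = OR*o / (1 + OR*o), where o = p / (1 - p)."""
    odds = baseline_prob / (1.0 - baseline_prob)
    new_odds = odds_ratio * odds
    return new_odds / (1.0 + new_odds)

# With an illustrative 5% baseline detection probability, the floor
# OR of 67.8 implies detection becomes far more likely than not.
print(round(apply_odds_ratio(0.05, 67.8), 3))  # 0.781
```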
Conclusions:
Distance from a patient’s bed did not predict SARS-CoV-2 RNA deposition in patient rooms, but surface type, severity of illness, and time from local pandemic wave predicted surface deposition.
We prospectively surveyed SARS-CoV-2 RNA contamination in staff common areas within an acute-care hospital. An increasing prevalence of surface contamination was detected over time. Adjusting for patient census or community incidence of coronavirus disease 2019 (COVID-19), the proportion of contaminated surfaces did not predict healthcare worker COVID-19 infection on study units.
Eggs contain important compounds related to enhanced cognition, but it is not clear if egg consumption, as a whole, has a direct impact on memory decline in older adults. This study aimed to determine whether egg intake levels predict the rate of memory decline in healthy older adults after sociodemographic and dietary controls. We conducted a secondary analysis of data from 470 participants, aged 50 and over, from the Biospsychosocial Religion and Health Study. Participants completed a food frequency questionnaire, which was used to calculate egg intake and divide participants into Low (<23 g/week, about half an egg), Intermediate (24–63 g/week, half to 1½ eggs) and High (≥63 g/week, about two or more eggs) tertiles. Participants were administered the California Verbal Learning Test – 2nd Edition (CVLT-II) Short Form in 2006–2007, and 294 of them were again tested in 2010–2011. Using linear mixed model analysis, no significant cross-sectional differences were observed in CVLT-II performance between egg intake levels after controlling for age, sex, race, education, body mass index, cardiovascular risk, depression and intake of meat, fish, dairy and fruits/vegetables. Longitudinally, the Intermediate egg group exhibited significantly slower rates of decline on the CVLT-II compared to the Low egg group. The High egg group also exhibited slower rates of decline, but this difference was not statistically significant. Thus, limited consumption of eggs (about 1 egg/week) was associated with slower memory decline in late life compared to consuming little to no eggs, but a dose-response effect was not clearly evident. This study may help explain discrepancies in previous research that did not control for other dietary intakes and risk factors.
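The tertile assignment described above can be sketched as a simple cut-point function. Note the published bands (<23, 24–63, ≥63 g/week) leave small boundary ambiguities around 23 and 63 g/week, so the boundaries below are one reasonable reading, not the study's exact rule:

```python
def egg_tertile(grams_per_week: float) -> str:
    """Assign weekly egg intake (grams) to the abstract's tertiles.
    Boundary handling is an assumption: <23 is Low, 23-62.99 is
    Intermediate, and >=63 is High."""
    if grams_per_week < 23:
        return "Low"
    if grams_per_week < 63:
        return "Intermediate"
    return "High"

# Roughly: one egg ~ 50 g, so 1 egg/week lands in the Intermediate band.
print(egg_tertile(50))   # Intermediate
print(egg_tertile(10))   # Low
print(egg_tertile(120))  # High
```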
Multidrug-resistant organisms (MDROs) colonizing the healthcare environment have been shown to contribute to risk for healthcare-associated infections (HAIs), with adverse effects on patient morbidity and mortality. We sought to determine how bacterial contamination and persistent MDRO colonization of the healthcare environment are related to the position of patients and wastewater sites.
Methods:
We performed a prospective cohort study, enrolling 51 hospital rooms at the time of admitting a patient with an eligible MDRO in the prior 30 days. We performed systematic sampling and MDRO culture of rooms, as well as 16S rRNA sequencing to define the environmental microbiome in a subset of samples.
Results:
The probability of detecting resistant gram-negative organisms, including Enterobacterales, Acinetobacter spp, and Pseudomonas spp, increased with distance from the patient. In contrast, Clostridioides difficile and methicillin-resistant Staphylococcus aureus were more likely to be detected close to the patient. Resistant Pseudomonas spp and S. aureus were enriched in these hot spots despite broad deposition of 16S rRNA gene sequences assigned to the same genera, suggesting modifiable factors that permit the persistence of these MDROs.
Conclusions:
MDRO hot spots can be defined by distance from the patient and from wastewater reservoirs. Evaluating how MDROs are enriched relative to bacterial DNA deposition helps to identify healthcare micro-environments and suggests how targeted environmental cleaning or design approaches could prevent MDRO persistence and reduce infection risk.
Warfare on the periphery of Europe and across cultural boundaries is a particular focus of this volume. One article, on Castilian seapower, treats the melding of northern and southern naval traditions; another clarifies the military roles of the Ayyubid and Mamluk miners and stoneworkers in siege warfare; a third emphasizes cultural considerations in an Icelandic conflict; a fourth looks at how an Iberian prelate navigated the line between ecclesiastical and military responsibilities; and a fifth analyzes the different roles of early gunpowder weapons in Europe and China, linking technological history with the significance of human geography. Three further contributions consider technology: two deal with fifteenth-century English artillery and a third with prefabricated mechanical artillery during the Crusades. Another theme of the volume is source criticism, with re-examinations of the sources for Owain Glyndwr's (possible) victory at Hyddgen in 1401, a (possible) Danish attack on England in 1128, and the role of non-milites in Salian warfare. Contributors: Nicolas Agrait, Tonio Andrade, David Bachrach, Oren Falk, Devin Fields, Michael S. Fulton, Thomas K. Heeboll-Holm, Rabei G. Khamisy, Michael Livingstone, Dan Spencer, L.J. Andrew Villalon
The objectives were to examine clinical characteristics, length of recovery, and the prevalence of delayed physician-documented recovery, compare clinical outcomes among those with sport-related concussion (SRC) and non-sport-related concussion (nSRC), and identify risk factors for delayed recovery.
Methods:
Included patients (8–18 years) were assessed ≤14 days post-injury at a multidisciplinary concussion program and diagnosed with an acute SRC or nSRC. Physician-documented clinical recovery was defined as returning to pre-injury symptom status, attending full-time school without symptoms, completing a Return-to-Sport strategy as needed, and a normal physical examination. Delayed physician-documented recovery was defined as >28 days post-injury.
Results:
Four hundred and fifteen patients were included (77.8% SRC). There was no difference in loss of consciousness (SRC: 9.9% vs nSRC: 13.0%, p = 0.39) or post-traumatic amnesia (SRC: 24.1% vs nSRC: 31.5%, p = 0.15) at the time of injury, and no difference in median Post-Concussion Symptom Scale scores (SRC: 20 vs nSRC: 23, p = 0.15) at initial assessment. Among those with complete clinical follow-up, median time to physician-documented clinical recovery was 20 days (SRC: 19 vs nSRC: 23; p = 0.37). There was no difference in the proportion of patients who developed delayed physician-documented recovery (SRC: 27.7% vs nSRC: 36.1%; p = 0.19). Higher initial symptom score increased the risk of delayed physician-documented recovery (IRR: 1.39; 95% CI: 1.29, 1.49). Greater material deprivation and social deprivation were associated with an increased risk of delayed physician-documented recovery.
Conclusions:
Most pediatric concussion patients who undergo early medical assessment and complete follow-up appear to make a complete clinical recovery within 4 weeks, regardless of mechanism.