To estimate the incidence, healthcare costs, and mortality associated with Clostridioides difficile infection (CDI) among adults <65 years old.
Design:
Retrospective cohort study.
Patients:
First CDI episodes among commercially insured US patients 18–64 years old were identified from a large claims database. CDI+ patients were propensity score-matched (PSM) 1:1 with CDI− controls using clinically relevant variables, including comorbidities.
Methods:
Annual CDI incidence was calculated by age group and year (2015–2019). Healthcare utilization, costs, and mortality were analyzed by age group, acquisition route (healthcare- or community-associated), and hospitalization status, with CDI-excess costs and mortality calculated as the difference between PSM CDI+ and CDI− individuals.
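The matched-cohort cost contrast described above can be sketched in a few lines. This is a hypothetical illustration only: it assumes propensity scores have already been estimated, uses greedy nearest-neighbour matching (one common PSM variant, not necessarily the study's algorithm), and all numbers are invented.

```python
# Hypothetical sketch of 1:1 propensity score matching and the
# "excess cost" contrast (CDI+ minus matched CDI-). Scores and costs
# are illustrative, not study data.

def match_1_to_1(cases, controls):
    """Greedy nearest-neighbour match without replacement.

    cases, controls: lists of (propensity_score, cost) tuples.
    Returns a list of (case, control) pairs.
    """
    available = sorted(controls)
    pairs = []
    for case in sorted(cases):
        if not available:
            break
        # pick the remaining control with the closest propensity score
        best = min(available, key=lambda c: abs(c[0] - case[0]))
        available.remove(best)
        pairs.append((case, best))
    return pairs

def excess_cost(pairs):
    """Mean cost difference (case minus control) over matched pairs."""
    diffs = [case[1] - ctrl[1] for case, ctrl in pairs]
    return sum(diffs) / len(diffs)

cases = [(0.30, 12_000), (0.55, 18_000), (0.70, 25_000)]
controls = [(0.28, 4_000), (0.50, 7_000), (0.72, 9_000), (0.90, 30_000)]
pairs = match_1_to_1(cases, controls)
print(round(excess_cost(pairs), 1))
```

In practice the matching would also enforce a caliper on the score distance and balance checks on the matched covariates; this sketch shows only the core pairing-and-differencing logic.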
Results:
In 50–64- and 18–49-year-olds, respective CDI incidence per 100,000 person-years decreased from 217 and 113 cases in 2015 to 167 and 87 cases in 2019. Most cases (76.5%–86.9%) were community-associated. The costs and mortality analyses included 6,332 matched CDI+/− 50–64-year-olds and 6,667 CDI+/− 18–49-year-olds. Among 50–64-year-olds, mean 2-month healthcare and patients’ out-of-pocket costs were $11,634 and $573 higher, respectively, in the CDI+ versus CDI− group. Among 18–49-year-olds, 2-month costs were $7,826 and $642 higher. Healthcare costs were higher for healthcare- versus community-associated CDI. At the 12-month follow-up, mortality was significantly higher in the CDI+ versus CDI− groups for both 50–64-year-olds (4.2% vs 2.0%; P < .001) and 18–49-year-olds (1.2% vs 0.6%; P < .001). Mortality rates were higher for hospitalized versus nonhospitalized CDI+ patients.
Conclusions:
Prevention of CDI among adults 18–64 years old may significantly reduce costs and mortality.
Most FDA-approved ADHD treatments increase norepinephrine (NE) and dopamine (DA); however, our prior preclinical studies of the non-stimulant ADHD treatment viloxazine ER (Qelbree®) demonstrated that viloxazine also increases serotonin (5-HT). A prior microdialysis study showed increases in NE, DA, and 5-HT in the rat prefrontal cortex (PFC); however, the 50 mg/kg dose resulted in supratherapeutic plasma concentrations. Viloxazine is a moderate-affinity selective NE reuptake inhibitor, structurally distinct from traditional SSRI antidepressants. Viloxazine has negligible activity at the serotonin reuptake transporter (SERT), suggesting that it elevates PFC 5-HT by a different mechanism than SSRIs. The current microdialysis study was undertaken to further characterize whether viloxazine affects 5-HT and its metabolite 5-HIAA at therapeutically relevant plasma concentrations. Results are compared to similar microdialysis studies of SSRIs.
Methods
Rats were implanted in the PFC with I-shaped microdialysis probes connected to a microperfusion pump delivering artificial cerebrospinal fluid. After a 2-hour baseline period, viloxazine (1, 3, 10, or 30 mg/kg) was administered (ip). Dialysate samples were collected from the interstitial fluid (ISF) of the PFC before and after dosing. LC-MS/MS was used to determine dialysate concentrations of viloxazine and viloxazine-induced changes in NE, 5-HT, and their respective metabolites, DHPG and 5-HIAA. Viloxazine plasma concentrations were also measured.
Animal research was approved by the Institutional Animal Care and Use Committee and conducted in accordance with the National Research Council’s Guide for the Care and Use of Laboratory Animals.
Results
Viloxazine administration resulted in significant dose-dependent increases in ISF NE levels and corresponding decreases in DHPG (NE metabolite) at all doses tested, reflecting viloxazine’s activity as a NET inhibitor. Viloxazine treatment also resulted in a dose-dependent elevation of ISF 5-HT levels in the PFC. Of the doses tested, 30 mg/kg was found to be clinically relevant as it induced ISF concentrations approximating unbound plasma concentrations in pediatric ADHD patients. At this dose, 5-HT levels were significantly increased over baseline and higher than vehicle levels. Coincident changes in 5-HIAA concentrations were not observed, reaffirming viloxazine’s lack of activity as a SERT inhibitor.
Conclusion
Viloxazine induced dose-dependent increases in NE and 5-HT in the PFC, a critical target region for ADHD therapies. At clinically relevant viloxazine plasma concentrations, 5-HT was increased in the PFC. Unlike SSRIs, which correspondingly decrease the 5-HT metabolite in the PFC (indicating serotonin reuptake inhibition), viloxazine did not affect 5-HIAA levels. Thus, viloxazine increases cortical 5-HT levels by a different mechanism than SSRIs. Whether 5-HT effects aid in viloxazine therapeutic efficacy in ADHD is yet unknown.
Bloodstream infections (BSIs) are a frequent cause of morbidity in patients with acute myeloid leukemia (AML), due in part to the presence of central venous access devices (CVADs) required to deliver therapy.
Objective:
To determine the differential risk of bacterial BSI during neutropenia by CVAD type in pediatric patients with AML.
Methods:
We performed a secondary analysis in a cohort of 560 pediatric patients (1,828 chemotherapy courses) receiving frontline AML chemotherapy at 17 US centers. The exposure was CVAD type at course start: tunneled externalized catheter (TEC), peripherally inserted central catheter (PICC), or totally implanted catheter (TIC). The primary outcome was course-specific incident bacterial BSI; secondary outcomes included mucosal barrier injury (MBI)-BSI and non-MBI BSI. Poisson regression was used to compute adjusted rate ratios comparing BSI occurrence during neutropenia by line type, controlling for demographic, clinical, and hospital-level characteristics.
Results:
The rate of BSI did not differ by CVAD type: 11 BSIs per 1,000 neutropenic days for TECs, 13.7 for PICCs, and 10.7 for TICs. After adjustment, there was no statistically significant association between CVAD type and BSI: PICC incident rate ratio [IRR] = 1.00 (95% confidence interval [CI], 0.75–1.32) and TIC IRR = 0.83 (95% CI, 0.49–1.41) compared to TEC. When MBI and non-MBI were examined separately, results were similar.
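The crude rate comparison behind figures like these can be reproduced with a simple incidence rate ratio and a log-scale Wald confidence interval. This sketch is not the study's adjusted Poisson model (which controlled for demographic, clinical, and hospital-level covariates); event counts and person-time below are made up for demonstration.

```python
import math

# Illustrative crude incidence rate ratio (IRR) with a Wald 95% CI on
# the log scale. Counts are hypothetical, chosen to give rates near the
# reported ~13.7 vs ~11 BSIs per 1,000 neutropenic days.

def rate_ratio_ci(events_a, time_a, events_b, time_b, z=1.96):
    """Crude IRR of group A vs group B, with a log-scale Wald CI."""
    irr = (events_a / time_a) / (events_b / time_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    lo = math.exp(math.log(irr) - z * se_log)
    hi = math.exp(math.log(irr) + z * se_log)
    return irr, lo, hi

# e.g. PICC vs TEC over 10,000 neutropenic days each (hypothetical)
irr, lo, hi = rate_ratio_ci(137, 10_000, 110, 10_000)
print(f"IRR {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A CI that spans 1.0, as in the study's adjusted estimates, indicates no statistically significant difference between line types.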
Conclusions:
In this large, multicenter cohort of pediatric AML patients, we found no difference in the rate of BSI during neutropenia by CVAD type. This may be due to a risk profile for BSI that is unique to AML patients.
Infants who require open heart surgery are at increased risk for developmental delays including gross motor impairments which may have implications for later adaptive skills and cognitive performance. We sought to evaluate the feasibility and efficacy of a tummy time intervention to improve motor skill development in infants after cardiac surgery.
Methods:
Infants <4 months of age who underwent cardiac surgery were randomly assigned to tummy time with or without outpatient reinforcement or standard of care prior to hospital discharge. The Alberta Infant Motor Scale (AIMS) was administered to each infant prior to and 3 months after discharge. Groups were compared, and the association between parent-reported tummy time at home and change in motor scores at follow-up was examined.
Results:
Parents of infants (n = 64) who had cardiac surgery at a median age of 5 days were randomly assigned to tummy time instruction (n = 20), tummy time + outpatient reinforcement (n = 21) or standard of care (n = 23). Forty-nine (77%) returned for follow-up. At follow-up, reported daily tummy time was not significantly different between groups (p = 0.17). Fifteen infants had <15 minutes of tummy time daily. Infants who received >15 minutes of tummy time daily had a significantly greater improvement in motor scores than infants with <15 minutes of tummy time daily (p = 0.01).
Conclusion:
In infants following cardiac surgery, <15 minutes of tummy time daily is associated with increased motor skill impairment. Further research is needed to elucidate the best strategies to optimise parental compliance with tummy time recommendations.
Levodopa-carbidopa intestinal gel infusion (LCIG) is an established therapy for advanced Parkinson disease (PD), resulting in a significant improvement in quality of life. With increased LCIG adoption worldwide, potential complications due to abnormal vitamin absorption or metabolism have been reported in these patients. Neurologists are often unfamiliar with vitamin physiology and the pathophysiological mechanisms of vitamin deficiency. Unfortunately, clinical and laboratory guidelines for vitamin monitoring and supplementation in the context of LCIG treatment are not available. We herein summarize the current knowledge on three vitamins that are reduced with LCIG therapy, reporting on their physiology, laboratory testing, and the clinical impact of their deficiency or excess. In addition, we propose opinion-based recommendations for clinicians treating LCIG patients. Patients and caregivers should be informed about the risk of vitamin deficiency. Vitamin B12, homocysteine, and methylmalonic acid (MMA) should be tested before starting LCIG, six months after, and once yearly thereafter. Vitamin B6 and folate testing is not universally available, but it should be considered if homocysteine is elevated while MMA and/or total vitamin B12 are normal. Prophylaxis of vitamin deficiency should be started as soon as LCIG is implemented, possibly even before. Dietary recommendations suffice for most patients, although a subgroup at higher risk should receive vitamin B12 regularly and cycles of B6. Finally, once diagnosed, a vitamin deficiency should be treated promptly, with clinical and laboratory monitoring; resistant cases may require non-oral routes of administration and possibly discontinuation of LCIG, even temporarily.
Antisaccade tasks can be used to index cognitive control processes, e.g. attention, behavioral inhibition, working memory, and goal maintenance, in people with brain disorders. Though diagnoses of schizophrenia (SZ), schizoaffective disorder (SAD), and bipolar I disorder with psychosis (BDP) are typically considered distinct entities, previous work shows patterns of cognitive deficits differing in degree, rather than in kind, across these syndromes.
Methods
Large samples of individuals with psychotic disorders were recruited through the Bipolar-Schizophrenia Network on Intermediate Phenotypes 2 (B-SNIP2) study. Anti- and pro-saccade task performances were evaluated in 189 people with SZ, 185 people with SAD, 96 people with BDP, and 279 healthy comparison participants. Logistic functions were fitted to each group's antisaccade speed-performance tradeoff patterns.
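The speed-performance tradeoff fit described above can be sketched as fitting accuracy as a logistic function of response latency. The implementation below is a deliberately simple least-squares grid search with invented data and parameter ranges; the study's actual fitting procedure and parameterization may differ.

```python
import math

# Sketch: fit P(correct antisaccade) as a logistic function of latency.
# Data and parameter grids are illustrative, not B-SNIP2 values.

def logistic(t, asymptote, midpoint, slope):
    """Probability correct as a function of latency t (ms)."""
    return asymptote / (1.0 + math.exp(-slope * (t - midpoint)))

def fit_tradeoff(latencies, accuracies, asymptote=1.0):
    """Least-squares grid search over (midpoint, slope)."""
    best = None
    for midpoint in range(150, 451, 5):          # ms
        for slope_i in range(1, 101):            # 0.001 .. 0.100 per ms
            slope = slope_i / 1000.0
            sse = sum((logistic(t, asymptote, midpoint, slope) - a) ** 2
                      for t, a in zip(latencies, accuracies))
            if best is None or sse < best[0]:
                best = (sse, midpoint, slope)
    return best[1], best[2]  # fitted midpoint, slope

# fake data: slower responses yield higher accuracy
lat = [180, 220, 260, 300, 340, 380]
acc = [0.12, 0.27, 0.50, 0.73, 0.88, 0.95]
mid, slope = fit_tradeoff(lat, acc)
print(mid, slope)
```

Group differences in the fitted midpoint and slope capture how much accuracy a group buys by responding more slowly, which is the "benefit from prolonged response latencies" contrast reported in the Results.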
Results
Psychosis groups had higher antisaccade error rates than the healthy group, with SZ and SAD participants committing 2 times as many errors, and BDP participants committing 1.5 times as many errors. Latencies on correctly performed antisaccade trials in SZ and SAD were longer than in healthy participants, although error trial latencies were preserved. Parameters of speed-performance tradeoff functions indicated that compared to the healthy group, SZ and SAD groups had optimal performance characterized by more errors, as well as less benefit from prolonged response latencies. Prosaccade metrics did not differ between groups.
Conclusions
With basic prosaccade mechanisms intact, the higher speed-performance tradeoff cost for antisaccade performance in psychosis cases indicates a deficit that is specific to the higher-order cognitive aspects of saccade generation.
We sought to contain a healthcare-associated coronavirus disease 2019 (COVID-19) outbreak, to evaluate contributory factors, and to prevent future outbreaks.
Methods:
All patients and staff on the outbreak ward (case cluster), and randomly selected patients and staff on COVID-19 wards (positive control cluster) and a non-COVID-19 ward (negative control cluster), underwent reverse-transcriptase polymerase chain reaction (RT-PCR) testing. Hand hygiene and personal protective equipment (PPE) compliance, detection of environmental SARS-CoV-2 RNA, patient behavior, and SARS-CoV-2 IgG antibody prevalence were assessed.
Results:
In total, 145 staff and 26 patients were exposed, resulting in 24 secondary cases. Among these, 4 of 14 (29%) staff and 7 of 10 (70%) patients were asymptomatic or presymptomatic. There was no difference in mean cycle threshold between asymptomatic or presymptomatic versus symptomatic individuals. None of 32 randomly selected staff from the control wards tested positive. Environmental RNA detection levels were higher on the COVID-19 ward than on the negative control ward (OR, 19.98; 95% CI, 2.63–906.38; P < .001). RNA levels on the COVID-19 ward (where there were no outbreaks) and the outbreak ward were similar (OR, 2.38; P = .18). Mean monthly hand hygiene compliance, based on 20,146 observations (over the preceding year), was lower on the outbreak ward (P < .006). Compared to both control wards, the proportion of staff with detectable antibodies was higher on the outbreak ward (OR, 3.78; 95% CI, 1.01–14.25; P = .008).
Conclusion:
Staff seroconversion was more likely during a short-term outbreak than from sustained duty on a COVID-19 ward. Environmental contamination and PPE use were similar on the outbreak and control wards. Patient noncompliance, decreased hand hygiene, and asymptomatic or presymptomatic transmission were more frequent on the outbreak ward.
Gas-fluidized beds of flexible fibres, which have rarely been studied, are investigated in this work using the coupled approach of the discrete element method and computational fluid dynamics. In the present numerical method, gas–fibre interaction is modelled by calculating the interaction force for each constituent element in the fibre, and the composition of the interaction forces on the constituent elements generates a resultant hydrodynamic force and a resultant hydrodynamic torque on the fibre. Pressure drops and fibre orientation results from the present simulations with various fibre aspect ratios are in good agreement with previous experimental and simulation results. Some novel results are obtained for the effects of fibre flexibility. Larger hydrodynamic forces on fibres (before the bed is fluidized) and smaller minimum fluidization velocities (MFVs) are observed for more flexible fibre beds due to the smaller porosities, while smaller hydrodynamic forces are obtained for the more flexible fibres when the beds are fluidized with significant fibre motion. By scaling the superficial gas velocity using the MFVs, the pressure-drop data collapse onto the Ergun correlation for stiff fibres of various aspect ratios; however, the pressure-drop curves deviate from the Ergun correlation for very flexible fibres, due to the significant fibre bed expansion before the MFV is reached. The fibre aspect ratio and flexibility both have an impact on the solids mixing rate, and it is found that the solids mixing rates are essentially determined by the ratio of the superficial gas velocity to the MFV.
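The Ergun correlation onto which the stiff-fibre data collapse relates the pressure gradient through a packed bed to superficial gas velocity, porosity, and an effective particle diameter. The sketch below evaluates the standard form of the correlation; the gas properties and effective diameter are illustrative values (ambient air, 1 mm), not parameters from the simulations.

```python
# Standard Ergun correlation for the pressure gradient through a packed
# bed: a viscous (Blake-Kozeny) term plus an inertial (Burke-Plummer)
# term. Property values are illustrative, not from the study.

def ergun_pressure_gradient(u, eps, d_p, mu=1.8e-5, rho_g=1.2):
    """Pressure drop per unit bed height, Pa/m.

    u     : superficial gas velocity (m/s)
    eps   : bed porosity (void fraction)
    d_p   : effective particle/fibre diameter (m)
    mu    : gas dynamic viscosity (Pa*s), default ~air at 20 C
    rho_g : gas density (kg/m^3), default ~air
    """
    viscous = 150.0 * mu * (1 - eps) ** 2 * u / (eps ** 3 * d_p ** 2)
    inertial = 1.75 * rho_g * (1 - eps) * u ** 2 / (eps ** 3 * d_p)
    return viscous + inertial

# More flexible beds pack tighter (lower porosity), giving a steeper
# pressure gradient at the same superficial velocity:
print(ergun_pressure_gradient(0.3, 0.45, 1e-3))
print(ergun_pressure_gradient(0.3, 0.40, 1e-3))
```

The strong porosity dependence (through the 1/ε³ factors) is why the smaller porosities of flexible-fibre beds raise the pre-fluidization hydrodynamic forces noted above.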
Understanding the impact of the COVID-19 pandemic on paediatric non-COVID-19-related care, as well as patient and caregiver concerns and stressors, is critical for informing healthcare delivery. It was hypothesised that high care disruptions and psychological stress would be observed among paediatric and adult CHD patients in the early phase of the pandemic.
Methods:
A cross-sectional, international, electronic survey study was completed. Eligible participants included parents of children with acquired heart disease or CHD, adults with CHD, or caregivers of adults with CHD.
Results:
A total of 1220 participants from 25 countries completed the survey from 16 April to 4 May 2020. Cardiac care disruption was significant, with 38% reporting delays in pre-pandemic scheduled cardiac surgeries and 46% experiencing postponed cardiac clinic visits. The majority of respondents (75%) endorsed moderate to high concern about the patient with heart disease becoming ill from COVID-19. Worry about returning for in-person care was significantly greater than worry about harm to the patient due to postponed care. Clinically significant psychological stress was high across the sample, including children (50%), adults with CHD (42%), and caregivers (42%).
Conclusions:
The early phase of the COVID-19 pandemic contributed to considerable disruptions in cardiac care for patients with paediatric and adult CHD. COVID-19-related fears are notable with potential to impact willingness to return to in-person care. Psychological stress is also very high necessitating intervention. Further study of the impact of delays in care on clinical outcomes is warranted.
To describe epidemiologic and genomic characteristics of a severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) outbreak in a large skilled-nursing facility (SNF), and the strategies that controlled transmission.
Design, setting, and participants:
This cohort study was conducted during March 22–May 4, 2020, among all staff and residents at a 780-bed SNF in San Francisco, California.
Methods:
Contact tracing and symptom screening guided targeted testing of staff and residents; respiratory specimens were also collected through serial point prevalence surveys (PPSs) in units with confirmed cases. Cases were confirmed by real-time reverse transcription–polymerase chain reaction testing for SARS-CoV-2, and whole-genome sequencing (WGS) was used to characterize viral isolate lineages and relatedness. Infection prevention and control (IPC) interventions included restricting from work any staff who had close contact with a confirmed case; restricting movement between units; implementing surgical face masking facility-wide; and the use of recommended PPE (ie, isolation gown, gloves, N95 respirator and eye protection) for clinical interactions in units with confirmed cases.
Results:
Of 725 staff and residents tested through targeted testing and serial PPSs, 21 (3%) were SARS-CoV-2 positive: 16 (76%) staff and 5 (24%) residents. Fifteen cases (71%) were linked to a single unit. Targeted testing identified 17 cases (81%), and PPSs identified 4 cases (19%). Most cases (71%) were identified before IPC interventions could be implemented. WGS was performed on SARS-CoV-2 isolates from 4 staff and 4 residents: 5 were of Santa Clara County lineage and the 3 others were distinct lineages.
Conclusions:
Early implementation of targeted testing, serial PPSs, and multimodal IPC interventions limited SARS-CoV-2 transmission within the SNF.
The aim was to determine the disparity between the overweight and obesity prevalence of Chinese American school-aged children and adolescents as measured by the Centers for Disease Control and Prevention (CDC) growth reference and the prevalence as measured by international and ethnic-specific growth references.
Design:
This retrospective, cross-sectional study measured overweight and obesity prevalence among a paediatric population using the CDC, International Obesity Task Force (IOTF), World Health Organization (WHO) and an ethnic Chinese growth curve.
Setting:
A community health centre in New York City, USA.
Participants:
Chinese American children aged 6–17 years in 2017 (N 9160).
Results:
The overweight prevalence was 24 % (CDC), 23 % (IOTF), 30 % (WHO) and 31 % (China). The obesity prevalence was 10 % (CDC), 5 % (IOTF), 10 % (WHO) and 10 % (China). When disaggregated by age and sex, the difference was the most prominent in girls; using the China reference compared with using the CDC reference almost doubles the overweight prevalence (school-aged: 31 v. 17 %, P < 0·001, adolescent: 27 v. 14 %, P < 0·001) and the obesity prevalence (school-aged: 11 v. 5 %, P < 0·001, adolescent: 7 v. 4 %, P < 0·001).
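The mechanism behind these diverging prevalence estimates is that each reference applies a different BMI-for-age cut-point to the same children. The sketch below illustrates this with invented cut-points for a single age/sex stratum; the values are NOT the real CDC/IOTF/WHO/China cut-points, which vary by age and sex and are published as tables.

```python
# Hypothetical illustration: the same BMI classified against different
# reference-specific overweight cut-points. Cut-point values are invented
# for one age/sex stratum, NOT the published reference values.

OVERWEIGHT_CUTOFF = {   # illustrative BMI-for-age cut-points
    "CDC": 19.3,
    "IOTF": 19.5,
    "WHO": 18.8,
    "China": 18.4,
}

def classify(bmi, reference):
    """Overweight (inclusive of obesity) vs not, under one reference."""
    return "overweight" if bmi >= OVERWEIGHT_CUTOFF[reference] else "not overweight"

bmi = 19.0   # one hypothetical girl
for ref in ("CDC", "IOTF", "WHO", "China"):
    print(ref, classify(bmi, ref))
```

A child falling between two references' cut-points, as here, is counted as overweight by one reference and missed by the other, which aggregated over a population produces the prevalence gaps reported above.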
Conclusions:
Use of the CDC reference compared with the Chinese ethnic-specific reference results in lower overweight and obesity prevalence in Chinese American girls. Almost half of the girls who were overweight and half of the girls who were obese were not identified using the CDC reference. Using ethnic-specific references or ethnic-specific cut-points may help improve overweight identification for Chinese American children.
The literature on the health effects of coffee consumption is contradictory: coffee has been found to protect against type 2 diabetes, yet also to induce acute glucose intolerance. Previous studies suggested that adding sugar and milk to coffee may alter the effect of its consumption on glucose metabolism, potentially explaining the contradiction, but this has seldom been investigated. This study aimed to assess the effect of adding milk and sugar to coffee on the postprandial glycemic response to a subsequent, high glycemic index (GI) meal. A total of 11 apparently healthy adults were recruited for this randomized, cross-over acute feeding study. In each experimental session, overnight-fasted participants consumed one cup of one of the following drinks: espresso (35 ml) or instant, boiled, or decaffeinated coffee (each 150 ml). They then consumed a high GI meal, and blood samples were collected every 15–30 minutes over the subsequent two hours. Each type of coffee was tested three times: twice with 50 g low-fat milk and 7.5 g white sugar added (i.e. white coffee) and once without (i.e. black coffee). Postprandial levels of glucose, insulin, and active GLP-1, as well as the incremental area under the curve (iAUC) of each biomarker, were compared between black and white coffee, with baseline measurements as a covariate. Results showed that the peak glucose level after the meal was lower in white coffee sessions than in black coffee sessions, regardless of coffee type. The difference was greatest between black and white decaffeinated coffee (mean difference from baseline ± SEM, white coffee: 2.5 ± 0.2 mmol/L; black coffee: 3.3 ± 0.3 mmol/L, p = 0.019). The mean glucose iAUC of white coffee sessions was significantly smaller than that of black coffee sessions for all coffee types except instant coffee (all p < 0.05).
The peak insulin levels between black and white coffees were not significantly different, yet white decaffeinated coffee had a 35% smaller mean insulin iAUC than black decaffeinated coffee (p = 0.025). The active GLP-1 levels were not significantly different between black and white coffee sessions. These results showed that prior ingestion of coffee with milk and sugar added attenuated the glucose response after the subsequent meal, compared with drinking black coffee beforehand. Our results may provide an explanation for the conflicting literature regarding the protective effects of coffee consumption against type 2 diabetes.
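The iAUC comparisons above rest on a standard calculation: trapezoidal area of the post-meal excursion above the fasting baseline. The sketch below uses a simplified version of the usual Wolever method (area below baseline is simply ignored) with invented glucose curves; the study's exact implementation may differ.

```python
# Sketch of an incremental area-under-curve (iAUC) calculation:
# trapezoidal area of the glucose excursion above the t=0 baseline,
# ignoring any area below baseline (a simplification of the standard
# Wolever method). Glucose values are invented for illustration.

def iauc(times, values):
    """Trapezoidal area above the t=0 baseline; below-baseline area ignored."""
    baseline = values[0]
    area = 0.0
    points = list(zip(times, values))
    for (t0, v0), (t1, v1) in zip(points, points[1:]):
        inc0 = max(v0 - baseline, 0.0)
        inc1 = max(v1 - baseline, 0.0)
        area += 0.5 * (inc0 + inc1) * (t1 - t0)
    return area

# glucose (mmol/L) at 0, 15, 30, 45, 60, 90, 120 min after the meal
t = [0, 15, 30, 45, 60, 90, 120]
black = [5.0, 6.2, 7.8, 8.3, 7.4, 6.1, 5.3]
white = [5.0, 5.9, 7.0, 7.5, 6.8, 5.8, 5.1]
print(iauc(t, black), iauc(t, white))   # blunted curve -> smaller iAUC
```

Units here are mmol/L x min; a flatter post-meal curve, like the white-coffee sessions reported above, yields a smaller iAUC.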
Clostridioides difficile infection (CDI) remains a significant public health concern, resulting in excess morbidity, mortality, and costs. Additional insight into the burden of CDI in adults aged <65 years is needed.
Design/Setting:
A 6-year retrospective cohort study was conducted using data extracted from United States Veterans Health Administration electronic medical records.
Patients/Methods:
Patients aged 18–64 years on January 1, 2011, were followed until incident CDI, death, loss to follow-up, or December 31, 2016. CDI was identified by a diagnosis code accompanied by metronidazole, vancomycin, or fidaxomicin therapy, or by a positive laboratory test. The clinical setting of CDI onset was defined according to 2017 SHEA-IDSA guidelines.
Results:
Of 1,073,900 patients, 10,534 had a CDI during follow-up. The overall incidence rate was 177 CDIs per 100,000 person-years, rising steadily from 164 per 100,000 person-years in 2011 to 189 per 100,000 person-years in 2016. Those with a CDI were slightly older (55 vs 51 years) and sicker, with a higher baseline Charlson comorbidity index score (1.4 vs 0.5), than those without an infection. Nearly half (48%) of all incident CDIs were community associated, and this proportion rose from 41% in 2011 to 56% in 2016.
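Incidence figures like these come from a simple person-time calculation: events divided by person-years at risk, scaled to 100,000. The sketch below shows the arithmetic; since the paper does not report total person-years here, the person-year denominator is back-calculated from the reported overall rate purely for illustration.

```python
# Person-time incidence arithmetic. The person-year total below is
# back-calculated from the reported overall rate (10,534 events at
# ~177 per 100,000 person-years) for illustration; it is not a figure
# taken from the paper.

def incidence_per_100k(events, person_years):
    """Incidence rate per 100,000 person-years."""
    return events / person_years * 100_000

implied_py = 10_534 / 177 * 100_000   # person-years implied by the rate
print(round(incidence_per_100k(10_534, implied_py)))
```

The same formula applied year by year, with each year's events and person-years, produces the 164-to-189 trend reported above.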
Conclusions:
The findings from this large retrospective study indicate that CDI incidence, driven primarily by increasing community-associated infection, is rising among young and middle-aged adult Veterans with high service-related disability. The increasing burden of community-associated CDI in this vulnerable population warrants attention. Future studies quantifying the economic and societal burden of CDI will inform decisions surrounding prevention strategies.
To evaluate the clinical impact of an antimicrobial stewardship program (ASP) on high-risk pediatric patients.
Design:
Retrospective cohort study.
Setting:
Free-standing pediatric hospital.
Patients:
This study included patients who received an ASP review between March 3, 2008, and March 2, 2017, and were considered high-risk, including patients receiving care by the neonatal intensive care (NICU), hematology/oncology (H/O), or pediatric intensive care (PICU) medical teams.
Methods:
The ASP recommendations included stopping antibiotics; modifying antibiotic type, dose, or duration; or obtaining an infectious diseases consultation. The outcomes evaluated in all high-risk patients with ASP recommendations were (1) hospital-acquired Clostridium difficile infection, (2) mortality, and (3) 30-day readmission. Subanalyses were conducted to evaluate hospital length of stay (LOS) and tracheitis treatment failure. Multivariable generalized linear models were performed to examine the relationship between ASP recommendations and each outcome after adjusting for clinical service and indication for treatment.
Results:
The ASP made 2,088 recommendations, and 50% of these recommendations were to stop antibiotics. Recommendation agreement occurred in 70% of these cases. Agreement with an ASP recommendation was not associated with higher odds of mortality or hospital readmission. Patients with a single ASP review and an agreed-upon recommendation had a shorter median LOS (10.2 days vs 13.2 days; P < .05). The ASP recommendations were not associated with high rates of tracheitis treatment failure.
Conclusions:
ASP recommendations do not result in worse clinical outcomes among high-risk pediatric patients. Most ASP recommendations are to stop or to narrow antimicrobial therapy. Further work is needed to enhance stewardship efforts in high-risk pediatric patients.
Currently, 564,000 Canadians are living with dementia. This number will continue to rise as the population ages. Family physicians play an integral role in the diagnosis and management of dementia patients. Although studies have looked at family physician perspectives on dementia care in the urban setting, much less is known about challenges in rural areas. This study aimed to explore rural family physicians’ experiences in caring for patients with dementia in rural Alberta, Canada. We conducted three semi-structured focus groups with 16 family physicians to evaluate barriers and facilitators to providing care to persons with dementia in three rural communities. We developed focus group questions based on the theoretical domains framework (TDF) and analysed the data using a framework approach. Physician capabilities, opportunities, and motivations appear to play important roles in caring for these patients. These research findings can be used to advance the quality of care for rural dementia patients.
Current policy emphasises the importance of ‘living well’ with dementia, but there has been no comprehensive synthesis of the factors related to quality of life (QoL), subjective well-being or life satisfaction in people with dementia. We examined the available evidence in a systematic review and meta-analysis. We searched electronic databases until 7 January 2016 for observational studies investigating factors associated with QoL, well-being and life satisfaction in people with dementia. Articles had to provide quantitative data and include ⩾75% people with dementia of any type or severity. We included 198 QoL studies taken from 272 articles in the meta-analysis. The analysis focused on 43 factors with sufficient data, relating to 37,639 people with dementia. Generally, these factors were significantly associated with QoL, but effect sizes were often small (0.1–0.29) or negligible (<0.09). Factors reflecting relationships, social engagement and functional ability were associated with better QoL. Factors indicative of poorer physical and mental health (including depression and other neuropsychiatric symptoms) and poorer carer well-being were associated with poorer QoL. Longitudinal evidence about predictors of QoL was limited. There was considerable between-study heterogeneity. The pattern of numerous predominantly small associations with QoL suggests a need to reconsider approaches to understanding and assessing living well with dementia.
Neurodevelopmental impairment is increasingly recognised as a potentially disabling outcome of CHD and formal evaluation is recommended for high-risk patients. However, data are lacking regarding the proportion of eligible children who actually receive neurodevelopmental evaluation, and barriers to follow-up are unclear. We examined the prevalence and risk factors associated with failure to attend neurodevelopmental follow-up clinic after infant cardiac surgery.
Methods
Survivors of infant (<1 year) cardiac surgery at our institution (4/2011-3/2014) were included. Socio-demographic and clinical characteristics were evaluated in neurodevelopmental clinic attendees and non-attendees in univariate and multivariable analyses.
Results
A total of 552 patients were included; median age at surgery was 2.4 months, 15% were premature, and 80% had moderate–severe CHD. Only 17% returned for neurodevelopmental evaluation, with a median age of 12.4 months. In univariate analysis, non-attendees were older at surgery, had lower surgical complexity, fewer non-cardiac anomalies, shorter hospital stay, and lived farther from the surgical center. Non-attendee families had lower income, and fewer were college graduates or had private insurance. In multivariable analysis, lack of private insurance remained independently associated with non-attendance (adjusted odds ratio 1.85, p=0.01), with a trend towards significance for distance from surgical center (adjusted odds ratio 2.86, p=0.054 for ⩾200 miles).
Conclusions
The majority of infants with CHD at high risk for neurodevelopmental dysfunction evaluated in this study are not receiving important neurodevelopmental evaluation. Efforts to remove financial/insurance barriers, increase access to neurodevelopmental clinics, and better delineate other barriers to receipt of neurodevelopmental evaluation are needed.
A variety of applications from insulation to catalytic supports can benefit from lightweight, high surface area, mesoporous materials that maintain their mesoporous structure at temperatures of 900–1200 °C. Silica aerogels begin to densify by 700 °C. Alumina aerogels are capable of higher temperature exposure than their silica counterparts, but undergo successive phase transformations to form transitional aluminas prior to densifying to α-alumina. The present study characterizes the phase transitions of aluminosilicate aerogels derived from boehmite powders to elucidate the role of time and temperature on phase transitions, surface area, and morphology. Aerogel compositions stable to 1200 °C for periods of 24 h have been demonstrated.