New Zealand and Australian governments rely heavily on voluntary industry initiatives to improve population nutrition, such as voluntary front-of-pack nutrition labelling (Health Star Rating [HSR]), industry-led food advertising standards, and optional food reformulation programmes. Research in both countries has shown that food companies vary considerably in their nutrition policies and practices(1). We aimed to determine whether a tailored nutrition support programme for food companies improved their nutrition policies and practices compared with control companies that were not offered the programme. REFORM was a 24-month, two-country, cluster-randomised controlled trial. 132 major packaged food/drink manufacturers (n=96) and fast-food companies (n=36) were randomly assigned (2:1 ratio) to receive a 12-month tailored support programme or to the control group (no intervention). The intervention group was offered a programme designed and delivered by public health academics comprising regular meetings, tailored company reports, and recommendations and resources to improve product composition (e.g., reducing nutrients of concern through reformulation), nutrition labelling (e.g., adoption of HSR labels), marketing to children (reducing the exposure of children to unhealthy products and brands), and nutrition policy and corporate sustainability reporting. The primary outcome was the nutrient profile (measured using HSR) of company food and drink products at 24 months. Secondary outcomes were the nutrient content (energy, sodium, total sugar, and saturated fat) of company products, display of HSR labels on packaged products, company nutrition-related policies and commitments, and engagement with the intervention. Eighty-eight eligible intervention companies (9,235 products at baseline) were invited to participate, of which 21 accepted and were enrolled in the REFORM programme (delivered between September 2021 and December 2022). Forty-four companies (3,551 products at baseline) were randomised to the control arm. At 24 months, the model-adjusted mean HSR of intervention company products was 2.58 compared with 2.68 for control companies, with no significant difference between groups (mean difference -0.10, 95% CI -0.40 to 0.21, p-value 0.53). A per-protocol analysis of intervention companies that enrolled in the programme compared with control companies with no major protocol violation also found no significant difference (2.93 vs 2.64, mean difference 0.29, 95% CI -0.13 to 0.72, p-value 0.18). We found no significant differences between the intervention and control groups in any secondary outcome, except for total sugar (g/100g), where the sugar content of intervention company products was higher than that of control companies (12.32 vs 6.98, mean difference 5.34, 95% CI 1.73 to 8.96, p-value 0.004). The per-protocol analysis for sugar did not show a significant difference (10.47 vs 7.44, mean difference 3.03, 95% CI -0.48 to 6.53, p-value 0.09). In conclusion, a 12-month tailored nutrition support programme for food companies did not improve the nutrient profile of company products.
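To illustrate the style of primary analysis described (a model-adjusted comparison of product-level HSR between arms in a cluster-randomised design), here is a minimal sketch assuming product-level data with a random intercept per company; the trial's actual model specification is not given in the abstract, and all column names and values are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_products, n_companies = 300, 30

# Hypothetical product-level data at 24 months
df = pd.DataFrame({"company": rng.integers(0, n_companies, n_products)})
df["arm"] = np.where(df["company"] < 20, "intervention", "control")  # 2:1
df["hsr"] = 2.6 + rng.normal(0, 0.8, n_products)  # Health Star Rating (0.5-5)

# Mixed model: fixed effect for trial arm, random intercept per company,
# so products are not treated as independent within a company (cluster)
model = smf.mixedlm("hsr ~ arm", df, groups=df["company"]).fit()
print(model.params)  # arm coefficient approximates the adjusted mean difference
```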
In response to the COVID-19 pandemic, we implemented a plasma coordination center within two months to support transfusions for two outpatient randomized controlled trials. The center design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
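Two core functions of such a platform are blood-group-compatible plasma matching and randomised assignment. A minimal sketch of each follows; the compatibility table reflects standard plasma transfusion rules (AB plasma is the universal plasma donor type), while the identifiers and the simple unblocked randomisation are illustrative assumptions, not the trial's actual implementation.

```python
import random

# Standard plasma compatibility: donor plasma -> acceptable recipient groups
# (AB plasma lacks anti-A/anti-B antibodies, so it suits any recipient)
PLASMA_COMPATIBLE = {
    "AB": {"A", "B", "AB", "O"},
    "A": {"A", "O"},
    "B": {"B", "O"},
    "O": {"O"},
}

def compatible_units(inventory, recipient_group):
    """Filter a digital inventory for units transfusable to this recipient."""
    return [u for u in inventory
            if recipient_group in PLASMA_COMPATIBLE[u["donor_group"]]]

def randomize(participant_id, arms=("convalescent", "control")):
    """Assign a participant to an arm; real systems use blocked schedules."""
    return random.choice(arms)

# Hypothetical inventory at one blood bank
inventory = [{"unit_id": "U1", "donor_group": "AB"},
             {"unit_id": "U2", "donor_group": "O"}]
print(compatible_units(inventory, "A"))   # -> unit U1 only
print(randomize("participant-001"))
```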
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipments of blood group-compatible plasma for transfusions into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain constraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
Unhealthy food environments are major drivers of obesity and diet-related diseases(1). Improving the healthiness of food environments requires a widespread organised response from governments, civil society, and industry(2). However, current actions often rely on voluntary participation by industry, such as opt-in nutrition labelling schemes, school/workplace food guidelines, and food reformulation programmes. The aim of the REFORM study is to determine the effects of providing tailored support to food companies on their nutrition-related policies and practices, compared with companies that are not offered the programme (the control). REFORM is a two-country, parallel cluster randomised controlled trial. 150 food companies were randomly assigned (2:1 ratio) to receive either a tailored support intervention programme or no intervention. Randomisation was stratified by country (Australia, New Zealand), industry sector (fast food, other packaged food/beverage companies), and company size. The primary outcome is the nutrient profile (measured using Health Star Rating [HSR]) of foods and drinks produced by participating companies at 24 months post-baseline. Secondary outcomes include company nutrition policies and commitments, the nutrient content (sodium, sugar, saturated fat) of products produced by participating companies, display of HSR labels, and engagement with the intervention. Eighty-three eligible intervention companies were invited to take part in the REFORM programme, and 21 (25%) accepted and were enrolled. Over 100 meetings were held with company representatives between September 2021 and December 2022. Resources and tailored reports were developed for six touchpoints covering product composition and benchmarking, nutrition labelling, consumer insights, nutrition policies, and incentives for companies to act on nutrition. Detailed information on programme resources and preliminary 12-month findings will be presented at the conference. The REFORM study will assess whether the provision of tailored support incentivises the food industry to improve its nutrition policies and actions.
The locus coeruleus (LC) innervates the cerebrovasculature and plays a crucial role in optimal regulation of cerebral blood flow. However, no human studies to date have examined links between these systems with widely available neuroimaging methods. We quantified associations between LC structural integrity and regional cortical perfusion and probed whether varying levels of plasma Alzheimer's disease (AD) biomarkers (Aβ42/40 ratio and ptau181) moderated these relationships.
Participants and Methods:
64 dementia-free community-dwelling older adults (ages 55-87) recruited across two studies underwent structural and functional neuroimaging on the same MRI scanner. 3D-pCASL MRI measured regional cerebral blood flow in limbic and frontal cortical regions, while T1-FSE MRI quantified rostral LC-MRI contrast, a well-established proxy measure of LC structural integrity. A subset of participants underwent fasting blood draw to measure plasma AD biomarker concentrations (Aβ42/40 ratio and ptau181). Multiple linear regression models examined associations between perfusion and LC integrity, with rostral LC-MRI contrast as predictor, regional CBF as outcome, and age and study as covariates. Moderation analyses included additional terms for plasma AD biomarker concentration and the plasma x LC interaction.
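A minimal sketch of the regression and moderation models just described, using simulated data; all variable names, values, and codings are hypothetical, since the abstract does not report them.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 64  # matches the sample size reported above

# Hypothetical participant-level data; names are illustrative only
df = pd.DataFrame({
    "lc": rng.normal(0.12, 0.03, n),        # rostral LC-MRI contrast
    "ptau181": rng.normal(2.0, 0.6, n),     # plasma p-tau181 concentration
    "age": rng.integers(55, 88, n),
    "study": rng.choice(["s1", "s2"], n),
})
df["cbf"] = 40 - 20 * df["lc"] + rng.normal(0, 3, n)  # regional CBF

# Base association: LC contrast -> regional CBF, adjusted for age and study
base = smf.ols("cbf ~ lc + age + C(study)", df).fit()

# Moderation: biomarker main effect plus biomarker x LC interaction term;
# a significant interaction means the LC-CBF slope shifts with pathology
moderation = smf.ols("cbf ~ lc * ptau181 + age + C(study)", df).fit()
print(moderation.params["lc:ptau181"])
```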
Results:
Greater rostral LC-MRI contrast was linked to lower regional perfusion in limbic regions, such as the amygdala (β = -0.25, p = 0.049) and entorhinal cortex (β = -0.20, p = 0.042), but was linked to higher regional perfusion in frontal cortical regions, such as the lateral (β = 0.28, p = 0.003) and medial (β = 0.24, p = 0.05) orbitofrontal cortices (OFC). Plasma amyloid levels moderated the relationship between rostral LC and amygdala CBF (Aβ42/40 ratio x rostral LC interaction term β = -0.31, p = 0.021), such that as plasma Aβ42/40 ratio decreased (i.e., greater pathology), the strength of the negative relationship between rostral LC integrity and amygdala perfusion decreased. Plasma ptau181 levels moderated the relationship between rostral LC and entorhinal CBF (ptau181 x rostral LC interaction term β = 0.64, p = 0.001), such that as ptau181 increased (i.e., greater pathology), the strength of the negative relationship between rostral LC integrity and entorhinal perfusion decreased. For frontal cortical regions, ptau181 levels moderated the relationship between rostral LC and lateral OFC perfusion (ptau181 x rostral LC interaction term β = -0.54, p = 0.004), as well as between rostral LC and medial OFC perfusion (ptau181 x rostral LC interaction term β = -0.53, p = 0.005), such that as ptau181 increased (i.e., greater pathology), the strength of the positive relationship between rostral LC integrity and frontal perfusion decreased.
Conclusions:
LC integrity is linked to regional cortical perfusion in non-demented older adults, and these relationships are moderated by plasma AD biomarker concentrations. Variable directionality of the associations between the LC and frontal versus limbic perfusion, as well as the differential moderating effects of plasma AD biomarkers, may signify a compensatory mechanism and a shifting pattern of hyperemia in the presence of aggregating AD pathology. Linking LC integrity and cerebrovascular regulation may represent an important understudied pathway of dementia risk and may help to bridge competing theories of dementia progression in preclinical AD studies.
Episodic memory functioning is distributed across two brain circuits, one of which courses through the dorsal anterior cingulate cortex (dACC). Thus, delivering non-invasive neuromodulation technology to the dACC may improve episodic memory functioning in patients with memory problems such as in amnestic mild cognitive impairment (aMCI). This preliminary study is a randomized, double-blinded, sham-controlled clinical trial to examine if high definition transcranial direct current stimulation (HD-tDCS) can be a viable treatment in aMCI.
Participants and Methods:
Eleven aMCI participants, of whom 9 had multidomain deficits, were randomized to receive 1 mA HD-tDCS (N=7) or sham (N=4) stimulation. HD-tDCS was applied over ten 20-minute sessions targeting the dACC. Neuropsychological measures of episodic memory, verbal fluency, and executive function were completed at baseline and after the last HD-tDCS session. Changes in composite scores for memory and language/executive function tests were compared between groups (one-tailed t-tests with α = 0.10 for significance). Clinically significant change, defined as > 1 SD improvement on at least one test in the memory and non-memory domains, was compared between active and sham stimulation based on the number of participants in each group showing such change.
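For illustration, a sketch of the group comparison described (a one-tailed independent-samples t-test at α = 0.10 on composite change scores); the change scores below are invented, and the 1 SD threshold for clinically significant change is applied against a hypothetical normative SD.

```python
import numpy as np
from scipy import stats

# Hypothetical composite change scores (post minus baseline)
active = np.array([4.4, 12.0, -3.0, 8.0, 2.0, 6.0, 1.5])  # n = 7
sham = np.array([-0.5, 3.0, -2.0, -2.5])                   # n = 4

# One-tailed independent-samples t-test, significance at alpha = 0.10
t, p = stats.ttest_ind(active, sham, alternative="greater")
print(f"t = {t:.2f}, one-tailed p = {p:.3f}, significant: {p < 0.10}")

# Clinically significant change: improvement of more than 1 SD on at
# least one test; the SD here is a hypothetical normative value
normative_sd = 10.0
print(int(np.sum(active > normative_sd)), "active participants improved > 1 SD")
```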
Results:
No statistically or clinically significant change (N−1 χ²; p = 0.62) was seen in episodic memory for the active HD-tDCS (MDiff = 4.4; SD = 17.1) or sham groups (MDiff = -0.5; SD = 9.7). However, the language and executive function composite showed statistically significant improvement (p = 0.04; MDiff = -15.3; SD = 18.4) for the active HD-tDCS group only (sham MDiff = -5.8; SD = 10.7). Multiple participants (N=4) in the active group had clinically significant enhancement on language and executive functioning tests, while no participants in the sham group did (p = 0.04).
Conclusions:
HD-tDCS targeting the dACC had no direct benefit for episodic memory deficits in aMCI based on preliminary findings for this ongoing clinical trial. However, significant improvement in language and executive function skills occurred in response to HD-tDCS, suggesting HD-tDCS in this configuration has promising potential as an intervention for language and executive function deficits in MCI.
Marine radiocarbon (14C) ages are an important geochronology tool for the understanding of past earthquakes and tsunamis that have impacted the coastline of New Zealand. To advance this field of research, we need an improved understanding of the radiocarbon marine reservoir correction for coastal waters of New Zealand. Here we report 170 new ΔR20 (1900–1950) measurements from around New Zealand made on pre-1950 marine shells and mollusks killed by the 1931 Napier earthquake. The influence of feeding method, living depth and environmental preference on ΔR is evaluated and we find no influence from these factors except for samples living at or around the high tide mark on rocky open coastlines, which tend to have anomalously low ΔR values. We examine how ΔR varies spatially around the New Zealand coastline and identify continuous stretches of coastline with statistically similar ΔR values. We recommend subdividing the New Zealand coast into four regions with different marine reservoir corrections: A: south and western South Island, ΔR20 –113 ± 33 yr, B: Cook Strait and western North Island, ΔR20 –171 ± 29 yr, C: northeastern North Island, ΔR20 –143 ± 18 yr, D: eastern North Island and eastern South Island, ΔR20 –70 ± 39 yr.
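For reference, the reported values follow the standard definition of the regional marine reservoir correction: ΔR is the offset between the measured radiocarbon age of a known-age marine sample and the age predicted by the global marine calibration curve at the same calendar date, with the subscript 20 denoting the Marine20 curve (a standard relation stated here for clarity, not a formula given in the abstract):

$$\Delta R_{20}(t) = {}^{14}\mathrm{C\,age}_{\mathrm{sample}}(t) - {}^{14}\mathrm{C\,age}_{\mathrm{Marine20}}(t)$$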
This small study utilised prospective data collection of patterns of prenatal alcohol and tobacco exposure (PAE and PTE) to examine associations with structural brain outcomes in 6-year-olds, and served as a pilot to determine the value of prospective data describing community-level patterns of PAE and PTE in a non-clinical sample of children. Participants from the Safe Passage Study in pregnancy were approached when their child was ∼6 years old, and the children completed structural brain magnetic resonance imaging, which was examined alongside archived PAE and PTE data (n = 51 child–mother dyads). Linear regression was used to conduct whole-brain structural analyses, with false-discovery rate (FDR) correction, to examine: (a) main effects of PAE, PTE and their interaction; and (b) the predictive potential of data reflecting patterns of PAE and PTE (e.g. quantity, frequency and timing (QFT)). Associations between PAE, PTE and their interaction with brain structural measures demonstrated unique profiles of cortical and subcortical alterations that were distinct between PAE only, PTE only and their interactive effects. Analyses examining associations between patterns of PAE and PTE (e.g. QFT) significantly detected brain alterations (that survived FDR correction) in this small non-clinical sample of children. These findings support the hypothesis that considering QFT and co-exposures is important for identifying brain alterations following PAE and/or PTE in a small group of young children. The current results demonstrate that teratogenic outcomes on brain structure differ as a function of PAE, PTE or their co-exposures, as well as the pattern (QFT) of exposure.
We developed an agent-based model using a trial emulation approach to quantify effect measure modification of spillover effects of pre-exposure prophylaxis (PrEP) for HIV among men who have sex with men (MSM) in the Atlanta-Sandy Springs-Roswell metropolitan area, Georgia. PrEP may impact not only the individual prescribed it, but also their partners and beyond, a phenomenon known as spillover. We simulated a two-stage randomised trial with eligible components (≥3 agents with ≥1 HIV+ agent) first randomised to intervention or control (no PrEP). Within intervention components, agents were randomised to PrEP with coverage of 70%, providing insight into a high PrEP coverage strategy. We evaluated effect modification by component-level characteristics and estimated spillover effects on HIV incidence using an extension of randomisation-based estimators. We observed an attenuation of the spillover effect when agents were in components with a higher prevalence of either drug use or bridging potential (where an agent acts as a mediator between ≥2 connected groups of agents). The estimated spillover effects were larger in magnitude among components with either higher HIV prevalence or greater density (the number of existing partnerships relative to all possible partnerships). Consideration of effect modification is important when evaluating the spillover of PrEP among MSM.
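A minimal sketch of the two-stage randomisation described above (components first, then agents within intervention components at 70% coverage); the toy network, the eligibility check, and the restriction of PrEP to HIV-negative agents are illustrative assumptions rather than the study's published implementation.

```python
import random

# Hypothetical network: components are lists of agent dicts
components = [
    [{"id": 1, "hiv": True}, {"id": 2, "hiv": False}, {"id": 3, "hiv": False}],
    [{"id": 4, "hiv": True}, {"id": 5, "hiv": False},
     {"id": 6, "hiv": False}, {"id": 7, "hiv": False}],
]

def eligible(component):
    """Eligibility rule from the abstract: >=3 agents with >=1 HIV+ agent."""
    return len(component) >= 3 and any(a["hiv"] for a in component)

def two_stage_randomise(components, coverage=0.70):
    """Stage 1: components to intervention/control.
    Stage 2: agents within intervention components to PrEP at `coverage`
    (restricted here to HIV-negative agents -- an illustrative assumption)."""
    for comp in filter(eligible, components):
        arm = random.choice(["intervention", "control"])
        for agent in comp:
            agent["arm"] = arm
            agent["prep"] = (arm == "intervention" and not agent["hiv"]
                             and random.random() < coverage)
    return components

two_stage_randomise(components)
```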
To examine cross-sectional associations between farmers’ market shopping behaviours and objectively measured and self-reported fruit and vegetable (FV) intake among rural North Carolina (NC) and New York City (NYC) shoppers.
Design:
Cross-sectional intercept surveys were used to assess self-reported FV intake and three measures of farmers’ market shopping behaviour: (1) frequency of purchasing FV; (2) variety of FV purchased and (3) dollars spent on FV. Skin carotenoids, a non-invasive biomarker for FV intake, were objectively measured using pressure-mediated reflection spectroscopy. Associations between farmers’ market shopping behaviours and FV intake were examined using regression models that controlled for demographic variables (e.g. age, sex, race, smoking status, education, income and state).
Setting:
Farmers’ markets (n 17 markets) in rural NC and NYC.
Participants:
A convenience sample of 645 farmers’ market shoppers.
Results:
Farmers’ market shoppers in NYC purchased a greater variety of FV and had higher skin carotenoid scores compared with shoppers in rural NC. Among all shoppers, there was a positive, statistically significant association between self-reported frequency of shopping at farmers’ markets and self-reported as well as objectively assessed FV intake. The variety of FV purchased and farmers’ market spending on FV also were positively associated with self-reported FV intake, but not skin carotenoids.
Conclusion:
Shoppers who purchase FV more frequently at farmers' markets, purchase a greater variety of FV and spend more money on FV have higher self-reported, and in some cases higher objectively measured, FV intake. Further research is needed to understand these associations and to test causality.
Substantial progress has been made in the standardization of nomenclature for paediatric and congenital cardiac care. In 1936, Maude Abbott published her Atlas of Congenital Cardiac Disease, which was the first formal attempt to classify congenital heart disease. The International Paediatric and Congenital Cardiac Code (IPCCC) is now utilized worldwide and has most recently become the paediatric and congenital cardiac component of the Eleventh Revision of the International Classification of Diseases (ICD-11). The most recent publication of the IPCCC was in 2017. This manuscript provides an updated 2021 version of the IPCCC.
The International Society for Nomenclature of Paediatric and Congenital Heart Disease (ISNPCHD), in collaboration with the World Health Organization (WHO), developed the paediatric and congenital cardiac nomenclature that is now within the eleventh version of the International Classification of Diseases (ICD-11). This unification of IPCCC and ICD-11 is the IPCCC ICD-11 Nomenclature and is the first time that the clinical nomenclature for paediatric and congenital cardiac care and the administrative nomenclature for paediatric and congenital cardiac care are harmonized. The resultant congenital cardiac component of ICD-11 was increased from 29 congenital cardiac codes in ICD-9 and 73 congenital cardiac codes in ICD-10 to 318 codes submitted by ISNPCHD through 2018 for incorporation into ICD-11. After these 318 terms were incorporated into ICD-11 in 2018, the WHO ICD-11 team added an additional 49 terms, some of which are acceptable legacy terms from ICD-10, while others provide greater granularity than the ISNPCHD thought was originally acceptable. Thus, the total number of paediatric and congenital cardiac terms in ICD-11 is 367. In this manuscript, we describe and review the terminology, hierarchy, and definitions of the IPCCC ICD-11 Nomenclature. This article, therefore, presents a global system of nomenclature for paediatric and congenital cardiac care that unifies clinical and administrative nomenclature.
The members of ISNPCHD realize that the nomenclature published in this manuscript will continue to evolve. The version of the IPCCC that was published in 2017 has evolved and changed, and it is now replaced by this 2021 version. In the future, ISNPCHD will again publish updated versions of IPCCC, as IPCCC continues to evolve.
Healthcare personnel with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection were interviewed to describe activities and practices in and outside the workplace. Among 2,625 healthcare personnel, workplace-related factors that may increase infection risk were more common among nursing-home personnel than hospital personnel, whereas selected factors outside the workplace were more common among hospital personnel.
Litigation in the National Health Service continues to rise, with a 9.4 per cent increase in clinical negligence claims from 2018–2019 to 2019–2020. The cost of these claims now accounts for 1.8 per cent of the National Health Service budget for 2019–2020. This study aimed to identify the characteristics of clinical negligence claims in the subspecialty of otology.
Methods
This study was a retrospective review of all clinical negligence claims in otology in England held by National Health Service Resolution between April 2013 and April 2018.
Results
There were 171 claims in otology, representing 24 per cent of all otolaryngology claims, with a potential cost of £24.5 million. Over half of these were associated with hearing loss. Stapedectomy was the operation with the highest mean cost per claim, at £769 438. The most common reasons for litigation were failure or delay in treatment (23 per cent), failure or delay in diagnosis (20 per cent), intra-operative complications (15 per cent) and inadequate consent (13 per cent).
Conclusion
There is a risk of high-cost claims in otology, especially with objective injuries such as hearing loss and facial nerve injury.
To conduct a pilot study implementing combined genomic and epidemiologic surveillance for hospital-acquired multidrug-resistant organisms (MDROs) to predict transmission between patients and to estimate the local burden of MDRO transmission.
Design:
Pilot prospective multicenter surveillance study.
Setting:
The study was conducted in 8 university hospitals (2,800 beds total) in Melbourne, Australia (population 4.8 million), including 4 acute-care, 1 specialist cancer care, and 3 subacute-care hospitals.
Methods:
All clinical and screening isolates from hospital inpatients (April 24 to June 18, 2017) were collected for 6 MDROs: vanA VRE, MRSA, ESBL Escherichia coli (ESBL-Ec) and Klebsiella pneumoniae (ESBL-Kp), and carbapenem-resistant Pseudomonas aeruginosa (CRPa) and Acinetobacter baumannii (CRAb). Isolates were analyzed and reported as routine by hospital laboratories, underwent whole-genome sequencing at the central laboratory, and were analyzed using open-source bioinformatic tools. MDRO burden and transmission were assessed using combined genomic and epidemiologic data.
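The combined genomic-epidemiologic assessment can be illustrated with a simple pairing rule: flag putative in-hospital transmission when two patients' isolates are close genomically and the patients had overlapping healthcare exposure. The SNP threshold and data structures below are illustrative heuristics, not the study's published criteria.

```python
# Illustrative rule combining genomics (pairwise SNP distance) with
# epidemiology (overlapping admissions); 15 SNPs is a common heuristic
# cutoff for relatedness, not necessarily the value used in this study.
SNP_THRESHOLD = 15

def putative_transmission(pair, snp_distance, admissions):
    a, b = pair
    genomic_link = snp_distance[(a, b)] <= SNP_THRESHOLD
    epi_link = admissions[a] & admissions[b]  # shared ward-days
    return genomic_link and bool(epi_link)

# Hypothetical data for two patients carrying the same MDRO species
snp_distance = {("p1", "p2"): 4}
admissions = {"p1": {"ward7-2017-05-01"}, "p2": {"ward7-2017-05-01"}}
print(putative_transmission(("p1", "p2"), snp_distance, admissions))  # True
```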
Results:
In total, 408 isolates were collected from 358 patients; 47.5% were screening isolates. ESBL-Ec was most common (52.5%), then MRSA (21.6%), vanA VRE (15.7%), and ESBL-Kp (7.6%). Most MDROs (88.3%) were isolated from patients with recent healthcare exposure.
Combining genomic and epidemiologic data showed that at least 27.1% of MDROs were likely acquired in a hospital; most of these transmission events would not have been detected without genomics. The highest proportion of transmission occurred with vanA VRE (88.4% of patients).
Conclusions:
Genomic and epidemiologic data from multiple institutions can feasibly be combined prospectively, providing substantial insights into the burden and distribution of MDROs, including in-hospital transmission. This analysis enables infection control teams to target interventions more effectively.
At present, analysis of diet and bladder cancer (BC) is mostly based on the intake of individual foods. The examination of food combinations provides scope to deal with the complexity and unpredictability of the diet and aims to overcome the limitations of studying nutrients and foods in isolation. This article aims to demonstrate the usability of supervised data mining methods to extract the food groups related to BC. In order to derive key food groups associated with BC risk, we applied the data mining technique C5.0 with 10-fold cross-validation in the BLadder cancer Epidemiology and Nutritional Determinants study, including data from eighteen case–control studies and one nested case–cohort study, comprising 8320 BC cases among 31 551 participants. Dietary data on the eleven main food groups of the Eurocode 2 Core classification codebook, and relevant non-diet data (i.e. sex, age and smoking status), were available. Five key food groups were extracted: in order of importance, beverages (non-milk); grains and grain products; vegetables and vegetable products; fats, oils and their products; and meats and meat products were associated with BC risk. Since these food groups correspond to previously proposed BC-related dietary factors, data mining seems to be a promising technique in the field of nutritional epidemiology and deserves further examination.
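C5.0 itself ships in R's C50 package; a minimal Python sketch of the same workflow follows, substituting scikit-learn's CART-style decision tree for C5.0 and using 10-fold cross-validation. All column names and the synthetic data are hypothetical, not the study's variables.

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200

# Hypothetical analysis frame: food-group intakes (g/day) plus non-diet
# covariates; names are illustrative stand-ins for Eurocode 2 groups
X = pd.DataFrame({
    "beverages_nonmilk": rng.gamma(2.0, 400, n),
    "grains": rng.gamma(2.0, 100, n),
    "vegetables": rng.gamma(2.0, 120, n),
    "fats_oils": rng.gamma(2.0, 20, n),
    "meats": rng.gamma(2.0, 60, n),
    "age": rng.integers(40, 80, n),
    "smoking": rng.integers(0, 3, n),  # 0 never, 1 former, 2 current
})
y = rng.integers(0, 2, n)  # bladder cancer case status (0/1)

# CART stands in for C5.0 here
tree = DecisionTreeClassifier(max_depth=5, random_state=0)
print(cross_val_score(tree, X, y, cv=10).mean())  # 10-fold cross-validation

# Feature importances rank the food groups driving the splits, analogous
# to the ordered list of key food groups reported above
print(tree.fit(X, y).feature_importances_)
```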
The aim of this guidance paper of the European Psychiatric Association is to provide evidence-based recommendations on the early detection of a clinical high risk (CHR) for psychosis in patients with mental problems. To this aim, we conducted a meta-analysis of studies reporting on conversion rates to psychosis in non-overlapping samples meeting at least one of the main CHR criteria: ultra-high risk (UHR) and/or basic symptoms criteria. Further, effects of potential moderators (different UHR criteria definitions, single UHR criteria and age) on conversion rates were examined. Conversion rates in the 42 identified samples, with altogether more than 4000 CHR patients who had mainly been identified by UHR criteria and/or the basic symptom criterion 'cognitive disturbances' (COGDIS), showed considerable heterogeneity. While UHR criteria and COGDIS were related to similar conversion rates until 2-year follow-up, conversion rates of COGDIS were significantly higher thereafter. Differences in onset and frequency requirements of symptomatic UHR criteria, or in their different consideration of functional decline, substance use and co-morbidity, did not seem to impact on conversion rates. The 'genetic risk and functional decline' UHR criterion was rarely met and only showed an insignificant pooled sample effect. However, age significantly affected UHR conversion rates, with lower rates in children and adolescents. Although more research into potential sources of heterogeneity in conversion rates is needed to facilitate improvement of CHR criteria, six evidence-based recommendations for the early detection of psychosis were developed as a basis for the EPA guidance on early intervention in CHR states.
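As an aside on method, pooling conversion proportions across heterogeneous samples is typically done with a random-effects model. A minimal sketch (DerSimonian-Laird on the logit scale) with invented study counts follows; the paper's actual meta-analytic model is not specified in the abstract, so this is a generic illustration only.

```python
import numpy as np

# Hypothetical per-study data: converters and total CHR patients
events = np.array([10, 25, 8])
totals = np.array([60, 120, 40])

# Logit-transformed conversion proportions and within-study variances
p = events / totals
y = np.log(p / (1 - p))
v = 1 / events + 1 / (totals - events)

# Fixed-effect pooled estimate and Cochran's Q (heterogeneity statistic)
w = 1 / v
ybar_fe = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - ybar_fe) ** 2)

# DerSimonian-Laird between-study variance
df = len(y) - 1
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)

# Random-effects pooled logit, back-transformed to a proportion
w_re = 1 / (v + tau2)
ybar_re = np.sum(w_re * y) / np.sum(w_re)
print(1 / (1 + np.exp(-ybar_re)))  # pooled conversion rate
```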
Given the common view that pre-exercise nutrition/breakfast is important for performance, the present study investigated whether breakfast influences resistance exercise performance via a physiological or psychological effect. Twenty-two resistance-trained, breakfast-consuming men completed three experimental trials, consuming water-only (WAT), or semi-solid breakfasts containing 0 g/kg (PLA) or 1·5 g/kg (CHO) maltodextrin. PLA and CHO meals contained xanthan gum and low-energy flavouring (approximately 122 kJ), and subjects were told both ‘contained energy’. At 2 h post-meal, subjects completed four sets of back squat and bench press to failure at 90 % ten repetition maximum. Blood samples were taken pre-meal, 45 min and 105 min post-meal to measure serum/plasma glucose, insulin, ghrelin, glucagon-like peptide-1 and peptide tyrosine-tyrosine concentrations. Subjective hunger/fullness was also measured. Total back squat repetitions were greater in CHO (44 (sd 10) repetitions) and PLA (43 (sd 10) repetitions) than WAT (38 (sd 10) repetitions; P < 0·001). Total bench press repetitions were similar between trials (WAT 37 (sd 7) repetitions; CHO 39 (sd 7) repetitions; PLA 38 (sd 7) repetitions; P = 0·130). Performance was similar between CHO and PLA trials. Hunger was suppressed and fullness increased similarly in PLA and CHO, relative to WAT (P < 0·001). During CHO, plasma glucose was elevated at 45 min (P < 0·05), whilst serum insulin was elevated (P < 0·05) and plasma ghrelin suppressed at 45 and 105 min (P < 0·05). These results suggest that breakfast/pre-exercise nutrition enhances resistance exercise performance via a psychological effect, although a potential mediating role of hunger cannot be discounted.
Campylobacteriosis is the most common notifiable disease in New Zealand. While the risk of campylobacteriosis has been found to be strongly associated with the consumption of undercooked poultry, other risk factors include rainwater-sourced drinking water, contact with animals and consumption of raw dairy products. Despite this, there has been little investigation of raw milk as a risk factor for campylobacteriosis. Recent increases in demand for untreated or ‘raw’ milk have also raised concerns that this exposure may become a more important source of disease in the future. This study describes the cases of notified campylobacteriosis from a sentinel surveillance site. Previously collected data from notified cases of raw milk-associated campylobacteriosis were examined and compared with campylobacteriosis cases who did not report raw milk consumption. Raw milk campylobacteriosis cases differed from non-raw milk cases on comparison of age and occupation demographics, with raw milk cases more likely to be younger and categorised as children or students for occupation. Raw milk cases were more likely to be associated with outbreaks than non-raw milk cases. Study-suggested motivations for raw milk consumption (health reasons, natural product, produced on farm, inexpensive or to support locals) were not strongly supported by cases. More information about the raw milk consumption habits of New Zealanders would be helpful to better understand the risks of this disease, especially with respect to increased disease risk observed in younger people. Further discussion with raw milk consumers around their motivations may also be useful to find common ground between public health concerns and consumer preferences as efforts continue to manage this ongoing public health issue.
Electrochemical capacitors featuring a modified acetonitrile (AN) electrolyte and a binder-free, activated carbon fabric electrode material were assembled and tested at <−40 °C. The melting point of the electrolyte was depressed relative to the standard pure AN solvent through the use of a methyl formate cosolvent, to enable operation at temperatures lower than the rated limit of typical commercial cells (−40 °C). Based on earlier electrolyte formulation studies, a 1:1 ratio of methyl formate to AN (by volume) was selected, to maximize freezing point depression while maintaining a sufficient salt solubility. The salt spiro-(1,1′)-bipyrrolidinium tetrafluoroborate was used, based on its improved conductivity at low temperatures, relative to linear alkyl ammonium salts. The carbon fabric electrode supported a relatively high rate capability at temperatures as low as −65 °C with a modest increase in cell resistance at this reduced temperature. The capacitance was only weakly dependent on temperature, with a specific capacitance of ∼110 F/g.
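For context on the ∼110 F/g figure: in a symmetric two-electrode cell, the per-electrode specific capacitance is commonly obtained from the measured cell capacitance with the relation below (a standard convention assuming equal electrode masses; the abstract does not state which convention was used):

$$C_{\mathrm{sp}} = \frac{4\,C_{\mathrm{cell}}}{m_{\mathrm{total}}}$$

where $m_{\mathrm{total}}$ is the combined mass of both electrodes; the factor of 4 arises because the two electrode capacitances add in series while each electrode carries only half the total mass.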
The purpose of this study was to examine whether vehicle type based on size (car vs. other, i.e. truck, van, or SUV) had an impact on the speeding, acceleration, and braking patterns of older male and female drivers (70 years and older) from a Canadian longitudinal study. The primary hypothesis was that older adults driving larger vehicles (e.g., trucks, SUVs, or vans) would be more likely to speed than those driving cars. Participants (n = 493) had a device installed in their vehicles that recorded their everyday driving. The findings suggest that the type of vehicle driven had little or no impact on the per cent of time spent speeding or on the braking and accelerating patterns of older drivers. Given that the propensity for exceeding the speed limit was high among these older drivers, regardless of vehicle type, future research should examine what effect this behaviour has on older-driver road safety.
Epistaxis is the most common ENT emergency. This study aimed to assess one-year mortality rates in patients admitted to a large teaching hospital.
Method
This study was a retrospective case note analysis of all patients admitted to the Queen Elizabeth University Hospital in Glasgow with epistaxis over a 12-month period.
Results
The one-year overall mortality for patients admitted with epistaxis was 9.8 per cent. The patients who died were older (mean age 77.2 vs 68.8 years; p = 0.002), had a higher Cumulative Illness Rating Scale-Geriatric score (9.9 vs 6.7; p < 0.001) and had a higher performance status score (2 or higher vs less than 2; p < 0.001). Other risk factors were a low admission haemoglobin level (less than 128 g/l vs 128 g/l or higher; p = 0.025), abnormal coagulation (p = 0.004), low albumin (less than 36 g/l vs more than 36 g/l; p < 0.001) and a longer length of stay (p = 0.046).
Conclusion
There are a number of risk factors associated with increased mortality after admission with epistaxis. This information could help with risk stratification of patients at admission and enable the appropriate patient support to be arranged.