Neurodevelopmental challenges are the most prevalent comorbidity associated with a diagnosis of critical CHD, and there is a high incidence of gross and fine motor delays noted in early infancy. The frequency of motor delays in hospitalised infants with critical CHD requires close monitoring by developmental therapists (physical therapists, occupational therapists, and speech-language pathologists) to optimise motor development. Currently, minimal literature defines developmental therapists’ role in caring for infants with critical CHD in intensive or acute care hospital units.
Purpose:
This article describes typical infant motor skill development, how the hospital environment and events surrounding early cardiac surgical interventions impact those skills, and how developmental therapists support motor skill acquisition in infants with critical CHD. Recommendations for healthcare professionals and those who provide medical or developmental support on promoting optimal motor skill development in hospitalised infants with critical CHD are discussed.
Conclusions:
Infants with critical CHD requiring neonatal surgical intervention experience interrupted motor skill interactions and developmental trajectories. As part of the interdisciplinary team working in intensive and acute care settings, developmental therapists assess, guide motor intervention, promote optimal motor skill acquisition, and support the infant’s overall development.
National standards to ensure effective transition and smooth transfer of adolescents from paediatric to adult services are available but data on successful transition in CHD are limited. The aim of this study is to assess the effectiveness of our transition pathway.
Methods:
Adolescents with CHD, aged 15–19 years, who attended the joint cardiac transition clinic between 2009 and 2018 were identified from the Patient Administration Systems. Patient attendance at their first adult CHD service appointment at Royal Papworth Hospital was recorded.
Results:
179 adolescents were seen in the joint cardiac transition clinic in the 9-year study period. The median age of the patients when seen was 16 (range 15–19) years. 145 patients were initially planned for transfer to the Royal Papworth Hospital adult CHD service. Three patients were subsequently excluded, and the success of the transfer of care was analysed in the remaining 142 patients. 112 (78%) attended their first follow-up in the adult CHD clinic as planned, 28 (20%) attended after reminders were sent out, with 5/28 requiring multiple reminders, and only 2 (1.4%) failed to attend. Overall, transfer of care was achieved in 140 (98.6%) patients.
Conclusion:
A dedicated joint cardiac transition clinic involving multi-professional medical and nursing teams from paediatric and adult cardiology services appears to achieve high engagement rates with the adult services. This approach allows a ‘face’ to be put on a named clinician delivering the adult service and should be encouraged.
A terrestrial (lacustrine and fluvial) palaeoclimate record from Hoxne (Suffolk, UK) shows two temperate phases separated by a cold episode, correlated with MIS 11 subdivisions corresponding to isotopic events 11.3 (Hoxnian interglacial period), 11.24 (Stratum C cold interval), and 11.23 (warm interval with evidence of human presence). A robust, reproducible multiproxy consensus approach validates and combines quantitative palaeotemperature reconstructions from three invertebrate groups (beetles, chironomids, and ostracods) and plant indicator taxa with qualitative implications of molluscs and small vertebrates. Compared with the present, interglacial mean monthly air temperatures were similar or up to 4.0°C higher in summer, but similar or as much as 3.0°C lower in winter; the Stratum C cold interval, following prolonged nondeposition or erosion of the lake bed, experienced summers 2.5°C cooler and winters between 5°C and 10°C cooler than at present. Possible reworking of fossils into Stratum C from underlying interglacial assemblages is taken into account. Oxygen and carbon isotopes from ostracod shells indicate evaporatively enriched lake water during Stratum C deposition. Comparative evaluation shows that proxy-based palaeoclimate reconstruction methods are best tested against each other and, if validated, can be used to generate more refined and robust results through multiproxy consensus.
To assess the incidence of colonization and infection with carbapenemase-producing Enterobacteriaceae (CPE) and carbapenem-resistant Acinetobacter baumannii (CR-Ab) in the ICUs of our city hospitals before and during the coronavirus disease 2019 (COVID-19) pandemic.
Methods:
We conducted a multicenter, before-and-after, cross-sectional study to compare the rates of colonization and infection with CPE and/or CR-Ab in 2 study periods, period 1 (January–April 2019) and period 2 (January–April 2020). Incidence rate ratios (IRRs) and 95% confidence intervals (CIs) of weekly colonization and infection rates for each period were compared for the 2 study periods using Poisson regression. Weekly trends in the incidence of colonization or infection for each study period were summarized using local weighted (Loess) regression.
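The abstract does not give the exact model specification, so the following is only a minimal sketch of how weekly counts in two periods might be compared with Poisson regression to obtain an incidence rate ratio (IRR) and 95% CI. All column names and counts are illustrative assumptions, not study data.

```python
# Minimal sketch: comparing weekly incidence between two periods with Poisson regression.
# Column names and counts are illustrative, not from the study.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "period": ["2019"] * 17 + ["2020"] * 17,                      # ~17 weeks, January-April
    "cases": np.concatenate([rng.poisson(3, 17), rng.poisson(5, 17)]),
    "patient_days": rng.integers(900, 1100, 34),
})

# Poisson GLM with log link; patient-days enter as an exposure offset.
# With default treatment coding, "2019" (period 1) is the reference level.
model = smf.glm(
    "cases ~ period",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["patient_days"]),
).fit()

irr = np.exp(model.params.iloc[1])              # IRR, period 2 vs period 1
ci_low, ci_high = np.exp(model.conf_int().iloc[1])
print(f"IRR = {irr:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```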
Results:
We detected no significant change in either the IRR or the weekly trend of CPE colonization and infection between the 2 study periods. A shift from KPC to other CPE mechanisms (OXA-48 and VIM) was observed during period 2. Compared with period 1, the IRRs of colonization and infection with CR-Ab increased 7.5- and 5.5-fold, respectively, during period 2. Genome sequencing showed that all CR-Ab strains belonged to the CC92/IC2 clonal lineage. Clinical strains clustered closely into a single monophyletic group in 1 of the 3 centers, whereas they segregated into 2 different clusters in the other 2 centers, which strongly indicates horizontal transmission.
Conclusions:
Our findings indicate the need to conduct infection control activities targeted against the spread of antimicrobial resistance between and within hospitals during the COVID-19 pandemic, and, if necessary, to adapt them to the new organizational structures imposed by the pandemic.
During the past decade, genetics research has allowed scientists and clinicians to explore the human genome in detail and reveal many thousands of common genetic variants associated with disease. Genetic risk scores, known as polygenic risk scores (PRSs), aggregate risk information from the most important genetic variants into a single score that describes an individual’s genetic predisposition to a given disease. This article reviews recent developments in the predictive utility of PRSs in relation to a person’s susceptibility to breast cancer and coronary artery disease. Prognostic models for these disorders are built using data from the UK Biobank, controlling for typical clinical and underwriting risk factors. Furthermore, we explore the possibility of adverse selection where genetic information about multifactorial disorders is available for insurance purchasers but not for underwriters. We demonstrate that prediction of multifactorial diseases, using PRSs, provides population risk information additional to that captured by normal underwriting risk factors. This research using the UK Biobank is in the public interest as it contributes to our understanding of predicting risk of disease in the population. Further research is imperative to understand how PRSs could cause adverse selection if consumers use this information to alter their insurance purchasing behaviour.
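The abstract describes PRSs as aggregating variant-level risk into a single score. A minimal sketch, under the common assumption that a PRS is a weighted sum of risk-allele counts with weights taken from GWAS effect sizes; all numbers below are illustrative, not UK Biobank data.

```python
# Minimal sketch: a polygenic risk score as a weighted sum of risk-allele counts.
# Effect sizes and genotypes are illustrative placeholders.
import numpy as np

# Per-variant GWAS effect sizes (log odds ratios) for a disease of interest.
effect_sizes = np.array([0.12, -0.05, 0.30, 0.08])

# Genotypes for three individuals: count of risk alleles (0, 1 or 2) at each variant.
genotypes = np.array([
    [0, 1, 2, 1],
    [2, 0, 0, 1],
    [1, 1, 1, 0],
])

# PRS_i = sum_j beta_j * g_ij; standardising against a reference sample makes
# scores comparable, e.g. for placing individuals into risk percentiles.
prs = genotypes @ effect_sizes
prs_standardised = (prs - prs.mean()) / prs.std()
print(prs_standardised)
```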
This study investigated metabolic, endocrine, appetite and mood responses to a maximal eating occasion in fourteen men (mean: age 28 (sd 5) years, body mass 77·2 (sd 6·6) kg and BMI 24·2 (sd 2·2) kg/m²) who completed two trials in a randomised crossover design. On each occasion, participants ate a homogeneous mixed-macronutrient meal (pizza). On one occasion, they ate until ‘comfortably full’ (ad libitum) and on the other, until they ‘could not eat another bite’ (maximal). Mean energy intake was double in the maximal (13 024 (95 % CI 10 964, 15 084) kJ; 3113 (95 % CI 2620, 3605) kcal) compared with the ad libitum trial (6627 (95 % CI 5708, 7547) kJ; 1584 (95 % CI 1364, 1804) kcal). Serum insulin incremental AUC (iAUC) increased approximately 1·5-fold in the maximal compared with ad libitum trial (mean: ad libitum 43·8 (95 % CI 28·3, 59·3) nmol/l × 240 min and maximal 67·7 (95 % CI 47·0, 88·5) nmol/l × 240 min, P < 0·01), but glucose iAUC did not differ between trials (ad libitum 94·3 (95 % CI 30·3, 158·2) mmol/l × 240 min and maximal 126·5 (95 % CI 76·9, 176·0) mmol/l × 240 min, P = 0·19). TAG iAUC was approximately 1·5-fold greater in the maximal v. ad libitum trial (ad libitum 98·6 (95 % CI 69·9, 127·2) mmol/l × 240 min and maximal 146·4 (95 % CI 88·6, 204·1) mmol/l × 240 min, P < 0·01). Total glucagon-like peptide-1, glucose-dependent insulinotropic peptide and peptide tyrosine–tyrosine iAUC were greater in the maximal compared with ad libitum trial (P < 0·05). Total ghrelin concentrations decreased to a similar extent, but AUC was slightly lower in the maximal v. ad libitum trial (P = 0·02). There were marked differences in appetite and mood between trials; most notably, maximal eating caused a prolonged increase in lethargy. Healthy men have the capacity to eat twice the energy content required to achieve comfortable fullness at a single meal. Postprandial glycaemia is well regulated following initial overeating, with elevated postprandial insulinaemia probably contributing.
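The incremental AUC (iAUC) values above are reported over a 240-minute postprandial window. A minimal sketch of one common convention (trapezoidal area of the rise above the fasting baseline, ignoring excursions below it); the paper's exact method is not stated in the abstract and the values used here are illustrative.

```python
# Minimal sketch: incremental area under the curve (iAUC) over a 240-min
# postprandial period, counting only the rise above the fasting baseline.
import numpy as np

time_min = np.array([0, 30, 60, 90, 120, 180, 240])        # sampling times, illustrative
glucose = np.array([5.0, 7.8, 7.2, 6.5, 6.0, 5.4, 5.1])    # mmol/l, illustrative values

baseline = glucose[0]
increments = np.clip(glucose - baseline, 0, None)           # clamp values below baseline to 0

# Trapezoidal rule over the increments; units are mmol/l x min.
iauc = float(np.sum(0.5 * (increments[1:] + increments[:-1]) * np.diff(time_min)))
print(f"glucose iAUC = {iauc:.0f} mmol/l x 240 min")
```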
Although death by neurologic criteria (brain death) is legally recognized throughout the United States, state laws and clinical practice vary concerning three key issues: (1) the medical standards used to determine death by neurologic criteria, (2) management of family objections before determination of death by neurologic criteria, and (3) management of religious objections to declaration of death by neurologic criteria. The American Academy of Neurology and other medical stakeholder organizations involved in the determination of death by neurologic criteria have undertaken concerted action to address variation in clinical practice in order to ensure the integrity of brain death determination. To complement this effort, state policymakers must revise legislation on the use of neurologic criteria to declare death. We review the legal history and current laws regarding neurologic criteria to declare death and offer proposed revisions to the Uniform Determination of Death Act (UDDA) and the rationale for these recommendations.
New approaches are needed to safely reduce emergency admissions to hospital by targeting interventions effectively in primary care. A predictive risk stratification tool (PRISM) identifies each registered patient's risk of an emergency admission in the following year, allowing practitioners to identify and manage those at higher risk. We evaluated the introduction of PRISM in primary care in one area of the United Kingdom, assessing its impact on emergency admissions and other service use.
METHODS:
We conducted a randomized stepped wedge trial with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. PRISM was implemented in eleven primary care practice clusters (total thirty-two practices) over a year from March 2013. We analyzed routine linked data outcomes for 18 months.
RESULTS:
We included outcomes for 230,099 registered patients, assigned to ranked risk groups.
Overall, the rate of emergency admissions was higher in the intervention phase than in the control phase: adjusted difference in number of emergency admissions per participant per year at risk, delta = .011 (95 percent Confidence Interval, CI .010, .013). Patients in the intervention phase spent more days in hospital per year: adjusted delta = .029 (95 percent CI .026, .031). Both effects were consistent across risk groups.
Primary care activity also increased in the intervention phase overall (adjusted delta = .011; 95 percent CI .007, .014), except in the two highest-risk groups, which showed a decrease in the number of days with recorded activity.
CONCLUSIONS:
Introduction of a predictive risk model in primary care was associated with increased emergency episodes across the general practice population and at each risk level, in contrast to the intended purpose of the model. Future evaluation work could assess the impact of targeting different services at patients across different levels of risk, rather than the current policy focus on those at highest risk.
Emergency admissions to hospital are a major financial burden on health services. In one area of the United Kingdom (UK), we evaluated a predictive risk stratification tool (PRISM) designed to support primary care practitioners to identify and manage patients at high risk of admission. We assessed the costs of implementing PRISM and its impact on health services costs. At the same time as the study, but independent of it, an incentive payment (‘QOF’) was introduced to encourage primary care practitioners to identify high risk patients and manage their care.
METHODS:
We conducted a randomized stepped wedge trial in thirty-two practices, with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. We analysed routine linked data on patient outcomes for 18 months (February 2013 – September 2014). We assigned standard unit costs in pound sterling to the resources utilized by each patient. Cost differences between the two study phases were used in conjunction with differences in the primary outcome (emergency admissions) to undertake a cost-effectiveness analysis.
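The methods combine per-patient cost differences with differences in the primary outcome to assess cost-effectiveness. A minimal sketch of a simple incremental cost-effectiveness ratio using the point estimates reported in these abstracts; this is illustrative arithmetic only and not necessarily the analysis the study performed.

```python
# Minimal sketch: an incremental cost-effectiveness ratio (ICER) combining
# per-patient cost and outcome differences between study phases.
# Point estimates are taken from the abstracts above; the study's own
# cost-effectiveness analysis may have been framed differently.
delta_cost_per_patient = 76.0          # GBP per patient per year (intervention minus control)
delta_admissions_per_patient = 0.011   # extra emergency admissions per patient per year

# Cost per additional emergency admission associated with the intervention phase.
# Because the intervention phase was both more costly and had more admissions,
# it would be considered dominated in conventional cost-effectiveness terms.
icer = delta_cost_per_patient / delta_admissions_per_patient
print(f"ICER = GBP {icer:,.0f} per additional emergency admission")
```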
RESULTS:
We included outcomes for 230,099 registered patients. We estimated a PRISM implementation cost of GBP0.12 per patient per year.
Costs of emergency department attendances, outpatient visits, emergency and elective admissions to hospital, and general practice activity were higher per patient per year in the intervention phase than control phase (adjusted δ = GBP76, 95 percent Confidence Interval, CI GBP46, GBP106), an effect that was consistent and generally increased with risk level.
CONCLUSIONS:
Despite low reported use of PRISM, it was associated with increased healthcare expenditure. This effect was unexpected and in the opposite direction to that intended. We cannot disentangle the effects of introducing the PRISM tool from those of imposing the QOF targets; however, since across the UK predictive risk stratification tools for emergency admissions have been introduced alongside incentives to focus on patients at risk, we believe that our findings are generalizable.
A predictive risk stratification tool (PRISM) to estimate a patient's risk of an emergency hospital admission in the following year was trialled in general practice in an area of the United Kingdom. PRISM's introduction coincided with a new incentive payment (‘QOF’) in the regional contract for family doctors to identify and manage the care of people at high risk of emergency hospital admission.
METHODS:
Alongside the trial, we carried out a complementary qualitative study of processes of change associated with PRISM's implementation. We aimed to describe how PRISM was understood, communicated, adopted, and used by practitioners, managers, local commissioners and policy makers. We gathered data through focus groups, interviews and questionnaires at three time points (baseline, mid-trial and end-trial). We analyzed data thematically, informed by Normalisation Process Theory (1).
RESULTS:
All groups showed high awareness of PRISM, but raised concerns about whether it could identify patients not yet known, and about whether there were sufficient community-based services to respond to care needs identified. All practices reported using PRISM to fulfil their QOF targets, but after the QOF reporting period ended, only two practices continued to use it. Family doctors said PRISM changed their awareness of patients and focused them on targeting the highest-risk patients, though they were uncertain about the potential for positive impact on this group.
CONCLUSIONS:
Though external factors supported its uptake in the short term, with a focus on the highest risk patients, PRISM did not become a sustained part of normal practice for primary care practitioners.
In this study photo-identification data were used to better understand movements, population structure and abundance of common bottlenose dolphin (Tursiops truncatus) in south-west England and surrounding waters, to inform conservation efforts. A catalogue of 485 photographic sightings of 113 individuals was compiled from ~150 common bottlenose dolphin encounters made on 87 dates between March 2007 and January 2014. From these and other data, three likely sub-populations were identified in the western English Channel, demarcated by bathymetry and distance to land: (1) south-west England – inshore Cornwall to Devon, (2) offshore English/French waters and (3) inshore France from Brittany to Normandy. Maximum abundance estimates for south-west England coastal waters, using two methods, ranged between 102 and 113 (range 87–142, 95% CL) over the period 2008–2013, likely qualifying the region as nationally important, whilst the yearly maximum was 58 in 2013. The population was centred on Cornwall, where 19 well-marked animals were considered ‘probable’ residents. There were no ‘probable’ resident well-marked individuals found to be restricted to either Devon or Dorset, with animals moving freely within coastal areas across the three counties. Movements were also detected within offshore English waters and French waters (from other studies) of the western English Channel, but no interchange has as yet been detected between the three regions, highlighting the possible separation of the populations, though sample sizes are insufficient to confirm this. Given the findings, south-west England waters should be considered as a separate management unit requiring targeted conservation efforts.
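The abstract reports abundance estimates "using two methods" without naming them here. As a minimal sketch, the following shows one standard closed-population estimator often applied to photo-ID data (the Chapman-modified Lincoln–Petersen estimator) with a correction for the proportion of well-marked animals; the estimator choice and all numbers are illustrative assumptions, not the study's data.

```python
# Minimal sketch: Chapman-modified Lincoln-Petersen abundance estimate from two
# photo-ID sampling occasions. All values are illustrative assumptions.
n1 = 40   # distinct well-marked individuals identified on occasion 1
n2 = 35   # distinct well-marked individuals identified on occasion 2
m2 = 15   # individuals seen on both occasions (recaptures)

# Chapman estimator for the number of well-marked animals.
n_marked = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# Photo-ID catalogues track only well-marked animals; scale up by the estimated
# proportion of well-marked individuals in the population (theta, assumed here).
theta = 0.7
n_total = n_marked / theta
print(f"estimated well-marked animals: {n_marked:.0f}; estimated total: {n_total:.0f}")
```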
An episode of postpartum psychosis can be devastating for a woman and her family, and it is vital we understand the factors involved in the aetiology of this condition. Sleep and circadian rhythm disruption is a plausible candidate but further research is needed that builds on the latest advances in chronobiology and neuroscience.
We report on a new class of materials for laser printer toner applications. These materials were prepared from methacrylsilane-in-water emulsions stabilized with colloidal silica particles. In this elegant system, the colloidal silica particles reside at the water/oil interface, helping to emulsify the oil droplets and self-organizing into a raspberry-like morphology. The emulsion formation is followed by free-radical polymerization, hydrophobic treatment, and drying steps. This one-pot synthesis in water affords a hydrophobic material with a particle size in the range of 80 to 300 nm. The particle size could be fine-tuned by changing the oil-to-silica mass ratio or by using colloidal silica particles of different sizes. Results of material characterization by solid-state NMR, electron microscopy, and particle size measurements will be presented. Examples of possible extensions of the synthesis towards materials with methacrylsilane partially substituted with other methacrylates will be provided. Application of the new material in toners will be described, as will the comparison of its performance with the incumbent material, hydrophobic colloidal silica.
The present study investigated the relationship between the milk protein content of a rehydration solution and fluid balance after exercise-induced dehydration. On three occasions, eight healthy males were dehydrated to an identical degree of body mass loss (BML, approximately 1·8 %) by intermittent cycling in the heat, rehydrating with 150 % of their BML over 1 h with either a 60 g/l carbohydrate solution (C), a 40 g/l carbohydrate, 20 g/l milk protein solution (CP20) or a 20 g/l carbohydrate, 40 g/l milk protein solution (CP40). Urine samples were collected pre-exercise, post-exercise, post-rehydration and for a further 4 h. Subjects produced less urine after ingesting the CP20 or CP40 drink compared with the C drink (P< 0·01), and at the end of the study, more of the CP20 (59 (sd 12) %) and CP40 (64 (sd 6) %) drinks had been retained compared with the C drink (46 (sd 9) %) (P< 0·01). At the end of the study, whole-body net fluid balance was more negative for trial C ( − 470 (sd 154) ml) compared with both trials CP20 ( − 181 (sd 280) ml) and CP40 ( − 107 (sd 126) ml) (P< 0·01). At 2 and 3 h after drink ingestion, urine osmolality was greater for trials CP20 and CP40 compared with trial C (P< 0·05). The present study further demonstrates that after exercise-induced dehydration, a carbohydrate–milk protein solution is better retained than a carbohydrate solution. The results also suggest that high concentrations of milk protein are not more beneficial in terms of fluid retention than low concentrations of milk protein following exercise-induced dehydration.
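The retention percentages and net fluid balance reported above are derived from body-mass loss, drink volume and urine output. A minimal worked sketch of how such quantities are typically calculated; the figures below are illustrative, not the study's individual data.

```python
# Minimal sketch: drink retention and net fluid balance after rehydration.
# All figures are illustrative placeholders.
body_mass_loss_kg = 1.4                              # ~1.8% of a 77 kg participant
drink_volume_ml = body_mass_loss_kg * 1000 * 1.5     # rehydration with 150% of body-mass loss
urine_after_drinking_ml = 900                        # cumulative urine output over follow-up

retained_ml = drink_volume_ml - urine_after_drinking_ml
retention_pct = 100 * retained_ml / drink_volume_ml

# Net fluid balance relative to the pre-exercise euhydrated state,
# taking 1 kg of body-mass loss as roughly 1 litre of fluid.
net_balance_ml = retained_ml - body_mass_loss_kg * 1000
print(f"retained {retention_pct:.0f}% of the drink; net balance {net_balance_ml:+.0f} ml")
```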
The microquasar GX 339-4 experienced an outburst in 2010. We focus on radio, NIR, optical and UV observations that are quasi-simultaneous with those made by INTEGRAL and RXTE in March–April 2010. X-ray transients are extreme systems, often harboring a black hole, known to emit throughout the whole electromagnetic spectrum in outburst. We studied the source evolution and correlated changes across all wavelengths. The bolometric flux increased from 0.8 to 2.9 × 10⁻⁸ erg cm⁻² s⁻¹ while the relative contribution of the hot medium decreased. The radio, NIR and optical emission from the jets was detected and observed to fade as the source softened; reprocessing in the disc was strong at the end.
Most accretion-powered relativistic jet sources in our Galaxy are transient X-ray binaries (XBs). Efforts to coordinate multiwavelength observations of these objects have improved dramatically over the last decade. Now the challenge is to interpret broadband spectral energy distributions (SEDs) of XBs that are well sampled in both wavelength and time. Here we focus on the evolution of the jet in their broadband spectra. Some of the most densely sampled broadband SEDs of a neutron star transient (IGR J00291+5934) are used to constrain the optically thick–thin break in the jet spectrum. For the black hole transient XTE J1550-564, infrared–X-ray correlations, evolution of broadband spectra and timing signatures indicate that synchrotron emission from the jet likely dominates the X-ray power law at low luminosities (~2 × 10⁻⁴ to 2 × 10⁻³ L_Edd) during the hard-state outburst decline.
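The luminosities above are quoted in Eddington units. A minimal sketch converting Eddington fractions to erg/s using the standard electron-scattering expression; the ~10 solar-mass black hole used here is an illustrative assumption, not a mass taken from the paper.

```python
# Minimal sketch: converting Eddington-scaled luminosities to erg/s.
# L_Edd ~= 1.26e38 (M / M_sun) erg/s for hydrogen (electron-scattering opacity).
# The 10 M_sun black hole mass is an illustrative assumption.
L_EDD_PER_SOLAR_MASS = 1.26e38   # erg/s per solar mass

m_bh_solar = 10.0
l_edd = L_EDD_PER_SOLAR_MASS * m_bh_solar

for fraction in (2e-4, 2e-3):
    print(f"{fraction:.0e} L_Edd  ->  {fraction * l_edd:.2e} erg/s")
```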