A primary barrier to translation of clinical research discoveries into care delivery and population health is the lack of sustainable infrastructure bringing researchers, policymakers, practitioners, and communities together to reduce silos in knowledge and action. As the National Institutes of Health's (NIH) mechanism to advance translational research, Clinical and Translational Science Award (CTSA) awardees are uniquely positioned to bridge this gap. Delivering on this promise requires sustained collaboration and alignment between research institutions and public health and healthcare programs and services. We describe the collaboration of seven CTSA hubs with city, county, and state healthcare and public health organizations striving to realize this vision together. Partnership representatives convened monthly to identify key components, common and unique themes, and barriers in academic–public collaborations. All partnerships aligned the activities of the CTSA programs with the needs of the city/county/state partners by sharing resources, responding to real-time policy questions and training needs, promoting best practices, and advancing community-engaged research and dissemination and implementation science to narrow the knowledge-to-practice gap. Barriers included competing priorities, differing timelines, bureaucratic hurdles, and unstable funding. Academic–public health/health system partnerships represent a unique and underutilized model with potential to enhance community and population health.
Introduction: Patients who achieve return of spontaneous circulation (ROSC) during the prehospital phase of resuscitation after an out-of-hospital cardiac arrest (OHCA) have a better survival rate than those who do not. The duration of resuscitation efforts before transport is initiated generally does not vary according to the initial rhythm observed. This study aimed to compare, by initial rhythm, the duration of resuscitation required to generate the majority of prehospital ROSC and of prehospital ROSC leading to survival. Methods: This cohort study was conducted using databases collected by the Corporation d'Urgences-santé in the Montréal region between 2010 and 2015. Patients with an OHCA of medical origin were included. Patients whose OHCA was witnessed by paramedics were excluded, as were those whose initial rhythm was unknown. We compared across groups (shockable rhythm [SR], pulseless electrical activity [PEA], and asystole) the rates of prehospital ROSC and the time required to obtain the majority of prehospital ROSC and of prehospital ROSC leading to survival. Results: A total of 6002 patients (3851 men and 2151 women) with a mean age of 52 years (±10) were included in the study, of whom 563 (9%) survived to hospital discharge and 1310 (22%) achieved prehospital ROSC. A total of 1545 (26%) patients had an SR, 1654 (28%) PEA, and 2803 (47%) asystole. Patients with an SR achieved prehospital ROSC and prehospital ROSC leading to survival more often than patients with PEA, who in turn had a better prognosis than those with initial asystole (777 patients [55%] vs 385 [23%] vs 148 [5%], p < 0.001; 431 [28%] vs 85 [5%] vs 7 [0.2%], p < 0.001, respectively). ROSC also occurred more quickly when the initial rhythm was an SR (13 minutes [±12] vs 18 [±13] vs 25 [±12], p < 0.001). However, a longer resuscitation period was required to capture 95% of prehospital ROSC leading to survival among patients with an SR (26 minutes vs 21 minutes vs 21 minutes). Conclusion: Patients with an initial shockable rhythm after OHCA have a better prognosis. Earlier transport to hospital could be considered for patients with PEA or asystole, compared with those with a shockable rhythm, when termination of resuscitation is not being considered.
Introduction: Patients whose out-of-hospital cardiac arrest (OHCA) was unwitnessed are generally excluded from extracorporeal resuscitation protocols because the delay before the start of their resuscitation is unknown. It has been proposed that the presence of an initial shockable rhythm (SR) strongly suggests a very short interval before the start of resuscitation. This study aimed to describe the association between the delay before the start of resuscitation and the presence of an SR in patients with an OHCA. Methods: This cohort study was conducted using databases collected by the Corporation d'Urgences-santé in the Montréal region between 2010 and 2015. Patients whose arrest was witnessed but for whom bystanders did not begin resuscitation were included. We also included patients whose arrest was witnessed by paramedics as a control group (delay before the start of resuscitation = 0 minutes). Patients with return of spontaneous circulation before the arrival of prehospital services were excluded, as were those whose initial rhythm was unknown. We described the change in the proportion of each rhythm over time and built a multivariate logistic regression adjusting for relevant sociodemographic and clinical variables. Results: A total of 1751 patients (1173 men and 578 women) with a mean age of 69 years (±16) were included in the primary analysis, of whom 603 (34%) had an SR. Another 663 patients had their OHCA witnessed directly by paramedics. A shorter delay before the start of resuscitation was associated with the presence of an SR (adjusted odds ratio = 0.97 [95% confidence interval 0.94-0.99], p = 0.016). However, this relationship is not linear, and the proportion of SR does not decline notably until 15 minutes have elapsed before the start of resuscitation (0 min = 35%, 1-5 min = 37%, 5-10 min = 35%, 10-15 min = 34%, >15 min = 16%). Conclusion: Although the proportion of patients with an SR decreases as the delay before the start of resuscitation increases, this relationship does not appear to be linear. The main drop in the proportion of patients with an SR appears to occur after the fifteenth minute of delay before the start of resuscitation.
Introduction: Extracorporeal cardiopulmonary resuscitation (ECPR) can potentially improve survival in patients with an out-of-hospital cardiac arrest (OHCA) refractory to standard treatment. This technique, generally performed in hospital, must be carried out as early as possible, so timely transport to hospital is necessary. This study aimed to describe the duration of prehospital resuscitation needed to optimise the time of departure for hospital so as to capture the maximum number of prehospital returns of spontaneous circulation (ROSC). Methods: This cohort study was conducted using databases collected by the Corporation d'Urgences-santé in the Montréal region between 2010 and 2015. Patients eligible for ECPR according to local criteria were included (<65 years, initial shockable rhythm, witnessed arrest with bystander resuscitation). Patients who arrested in front of paramedics were excluded, as were those with ROSC before the arrival of prehospital services. We calculated sensitivity and specificity at different cutoffs for predicting prehospital ROSC and survival to hospital discharge, and a ROC curve was constructed. Results: A total of 236 patients (207 men and 29 women) with a mean age of 52 years (±10) were included in the study, of whom 93 (39%) survived to hospital discharge and 136 (58%) achieved prehospital ROSC. The mean delay before ROSC was 13 minutes (±10). More than 50% of survivors had achieved ROSC less than 8 minutes after prehospital providers began resuscitation, and more than 90% within 24 minutes. More than 50% of all ROSC occurred within the first 10 minutes of resuscitation and more than 90% within the first 31 minutes. The ROC curve showed that the delay to ROSC maximising sensitivity and specificity for predicting survival in these patients was 22 minutes (sensitivity = 90%, specificity = 78%; area under the curve = 0.89 [95% confidence interval 0.84-0.93]). Conclusion: Departure for hospital could be considered for these patients between 8 and 24 minutes after the start of resuscitation. A resuscitation period of 22 minutes appears to be the best compromise in this regard.
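As a point of reference for the ROC analysis described in this abstract, the sketch below shows, on simulated data, how a cutoff maximising sensitivity and specificity (Youden's J) can be read off a ROC curve. It is purely illustrative: the variable names and the simulated relationship between time to ROSC and survival are assumptions, not the study's data or code.

```python
# Illustrative sketch only: simulated data, not the study's analysis.
# Shows how a time-to-ROSC cutoff maximising sensitivity + specificity
# (Youden's J) is read off a ROC curve.
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(0)
n = 236
delay_min = rng.gamma(shape=2.0, scale=7.0, size=n)          # hypothetical minutes to prehospital ROSC
p_survive = 1.0 / (1.0 + np.exp(0.15 * (delay_min - 15.0)))  # shorter delays -> higher survival (assumed)
survived = (rng.random(n) < p_survive).astype(int)

# Shorter delay predicts survival, so score each patient with the negative delay.
fpr, tpr, thresholds = roc_curve(survived, -delay_min)
best = np.argmax(tpr - fpr)                                  # Youden's J = sensitivity + specificity - 1
print(f"AUC = {auc(fpr, tpr):.2f}")
print(f"Cutoff ~ {-thresholds[best]:.0f} min: sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f}")
```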
Introduction: Among patients with an out-of-hospital cardiac arrest (OHCA), those who achieve return of spontaneous circulation (ROSC) during the prehospital phase of resuscitation have a better survival rate. It is plausible that patients who achieve ROSC earlier in their prehospital resuscitation have better survival rates than those who achieve ROSC later. This study aimed to describe the association between survival and the duration of resuscitation by paramedics before prehospital ROSC. Methods: This cohort study was conducted using databases collected by the Corporation d'Urgences-santé in the Montréal region between 2010 and 2015. All adult patients with prehospital ROSC after an OHCA of medical origin were included. Patients who arrested in front of paramedics were excluded, as were those with ROSC before the arrival of prehospital services. The association between survival and the duration of resuscitation before ROSC was assessed using a multivariate logistic regression adjusting for relevant sociodemographic and clinical variables (age, sex, initial rhythm, time of the initial call, witnessed arrest, bystander resuscitation, presence of first responders or advanced care paramedics, and delay before the arrival of prehospital providers). Results: A total of 1194 patients (818 men and 376 women) with a mean age of 64 years (±17) were included in the study, of whom 433 (36%) survived to hospital discharge. The mean delay before ROSC was 17 minutes (±12). We observed an independent association between survival to hospital discharge and the delay before prehospital ROSC (adjusted odds ratio = 0.91 [95% confidence interval 0.89-0.92], p < 0.001). More than 50% of survivors had achieved ROSC less than 9 minutes after prehospital providers began resuscitation, and more than 95% within 26 minutes. None (0%) of the 17 patients who achieved ROSC more than 56 minutes after the start of prehospital resuscitation survived. Conclusion: Early ROSC appears to be a favourable prognostic factor in patients with an OHCA. Most patients with prehospital ROSC who went on to survive their hospital stay achieved ROSC within 9 minutes of the start of resuscitation.
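To make the adjusted analysis described above concrete, the following is a minimal sketch of fitting a multivariate logistic regression of survival on time to prehospital ROSC plus covariates and reporting adjusted odds ratios. The data are simulated and the column names are assumptions; it is not the study's model.

```python
# Illustrative sketch only (simulated data, not the study's): adjusted logistic
# regression of survival on time to prehospital ROSC plus covariates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1194
df = pd.DataFrame({
    "delay_rosc": rng.gamma(2.0, 8.5, n),      # minutes of resuscitation before ROSC (hypothetical)
    "age": rng.normal(64, 17, n),
    "male": rng.integers(0, 2, n),
    "shockable_rhythm": rng.integers(0, 2, n),
    "witnessed": rng.integers(0, 2, n),
    "bystander_cpr": rng.integers(0, 2, n),
})
lin = -0.09 * df["delay_rosc"] + 1.2 * df["shockable_rhythm"] + 0.4 * df["bystander_cpr"]
df["survived"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-lin))).astype(int)

model = smf.logit(
    "survived ~ delay_rosc + age + male + shockable_rhythm + witnessed + bystander_cpr",
    data=df,
).fit(disp=False)
print(np.exp(model.params))      # adjusted odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals for the odds ratios
```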
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
Childhood early life stress (ELS) increases risk of adulthood major depressive disorder (MDD) and is associated with altered brain structure and function. It is unclear whether specific ELSs affect depression risk, cognitive function and brain structure.
Method
This cross-sectional study included 64 antidepressant-free depressed and 65 never-depressed individuals. Both groups reported a range of ELSs on the Early Life Stress Questionnaire, completed neuropsychological testing and 3T magnetic resonance imaging (MRI). Neuropsychological testing assessed domains of episodic memory, working memory, processing speed and executive function. MRI measures included cortical thickness and regional gray matter volumes, with a priori focus on the cingulate cortex, orbitofrontal cortex (OFC), amygdala, caudate and hippocampus.
Results
Of 19 ELSs, only emotional abuse, sexual abuse and severe family conflict independently predicted adulthood MDD diagnosis. The effect of total ELS score differed between groups. Greater ELS exposure was associated with slower processing speed and smaller OFC volumes in depressed subjects, but faster speed and larger volumes in non-depressed subjects. In contrast, exposure to ELSs predictive of depression had similar effects in both diagnostic groups. Individuals reporting predictive ELSs exhibited poorer processing speed and working memory performance, smaller volumes of the lateral OFC and caudate, and decreased cortical thickness in multiple areas including the insula bilaterally. Predictive ELS exposure was also associated with smaller left hippocampal volume in depressed subjects.
Conclusions
Findings suggest an association between childhood trauma exposure and adulthood cognitive function and brain structure. These relationships appear to differ between individuals who do and do not develop depression.
Maternal diet-induced obesity can have detrimental developmental origins of health and disease effects in offspring. Perinatal exposure to a high-fat diet (HFD) can lead to later behavioral and metabolic disturbances, but it is not clear which behaviors and metabolic parameters are most vulnerable. To address this critical gap, biparental and monogamous oldfield mice (Peromyscus polionotus), which may better replicate most human societies, were used in the current study. About 2 weeks before breeding, adult females were placed on a control or HFD and maintained on the diets throughout gestation and lactation. F1 offspring were placed at weaning (30 days of age) on the control diet, and spatial learning and memory, anxiety, exploratory behavior, voluntary physical activity, and metabolic parameters were tested when they reached adulthood (90 days of age). Surprisingly, maternal HFD caused decreased latency in initial and reverse Barnes maze trials in male, but not female, offspring. Both male and female HFD-exposed offspring showed increased anxiogenic behaviors but decreased exploratory and voluntary physical activity. Moreover, HFD offspring demonstrated lower resting energy expenditure (EE) compared with controls. Accordingly, HFD offspring weighed more at adulthood than those from control-fed dams, likely the result of reduced physical activity and EE. Current findings indicate a maternal HFD may increase obesity susceptibility in offspring due to prenatal programming resulting in reduced physical activity and EE later in life. Further work is needed to determine the underpinning neural and metabolic mechanisms by which a maternal HFD adversely affects neurobehavioral and metabolic pathways in offspring.
Objective.
To develop a candidate definition for central line–associated bloodstream infection (CLABSI) in neonates with presumed mucosal barrier injury due to gastrointestinal (MBI-GI) conditions and to evaluate the epidemiology and microbiology of MBI-GI CLABSI in infants.
Design.
Multicenter retrospective cohort study.
Setting.
Neonatal intensive care units from 14 US children’s hospitals and pediatric facilities.
Methods.
A multidisciplinary focus group developed a candidate MBI-GI CLABSI definition based on presence of an MBI-GI condition, parenteral nutrition (PN) exposure, and an eligible enteric organism. CLABSI surveillance data from participating hospitals were supplemented by chart review to identify MBI-GI conditions and PN exposure.
Results.
During 2009–2012, 410 CLABSIs occurred in 376 infants. MBI-GI conditions and PN exposure occurred in 149 (40%) and 324 (86%) of these 376 neonates, respectively. The distribution of pathogens was similar among neonates with versus without MBI-GI conditions and PN exposure. Fifty-nine (16%) of the 376 initial CLABSI episodes met the candidate MBI-GI CLABSI definition. Subsequent versus initial CLABSIs were more likely to be caused by an enteric organism (22 of 34 [65%] vs 151 of 376 [40%]; P = .009) and to meet the candidate MBI-GI CLABSI definition (19 of 34 [56%] vs 59 of 376 [16%]; P < .01).
Conclusions.
While MBI-GI conditions and PN exposure were common, only 16% of initial CLABSIs met the candidate definition of MBI-GI CLABSI. The high proportion of MBI-GI CLABSIs among subsequent infections suggests that infants with MBI-GI CLABSI should be a population targeted for further surveillance and interventional research.
Infect Control Hosp Epidemiol 2014;35(11):1391–1399
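As an editorial illustration of the proportion comparison reported in the Results of this abstract (enteric organisms in 22 of 34 subsequent vs 151 of 376 initial CLABSIs), the short sketch below compares the two proportions with standard 2x2 tests. The counts are taken from the abstract; the choice of tests is ours and may differ from the authors' method.

```python
# Illustrative 2x2 comparison of the proportions reported above.
# Counts come from the abstract; the specific test used by the authors may differ.
from scipy.stats import chi2_contingency, fisher_exact

table = [[22, 34 - 22],     # subsequent CLABSIs: enteric vs non-enteric organisms
         [151, 376 - 151]]  # initial CLABSIs: enteric vs non-enteric organisms

odds_ratio, p_fisher = fisher_exact(table)
chi2, p_chi2, dof, expected = chi2_contingency(table)
print(f"Fisher exact p = {p_fisher:.3f}; chi-square p = {p_chi2:.3f}")
```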
We surveyed US and Canadian pediatric hospitals about their use of central line-associated bloodstream infection (CLABSI) prevention strategies beyond typical insertion and maintenance bundles. We found wide variation in supplemental strategies across hospitals and in their penetration within hospitals. Future studies should assess specific adjunctive prevention strategies and CLABSI rates.
Countries of the Wider Caribbean have committed to principled ocean governance through several multilateral environmental and fisheries agreements at both the regional (e.g., the Cartagena Convention SPAW Protocol) and international level (e.g., the Convention on Biological Diversity, the United Nations Fish Stocks Agreement, the FAO Code of Conduct for Responsible Fisheries). They have also committed to the 2002 World Summit on Sustainable Development (WSSD) targets for fisheries and biodiversity conservation. However, the ongoing challenge is to put in place the measures required to give effect to these principles at the local, national and regional levels (Fanning et al. 2009). While not minimising the important role of science in an ecosystem approach to managing the living marine resources of the Wider Caribbean Region, the chapters in this book serve to highlight the importance that regional experts have placed on the role of governance in addressing the problems in the region.
This synthesis chapter presents the outputs of a discussion specifically relating to the role of governance in achieving and implementing a shared vision for ecosystem-based management (EBM) in the Wider Caribbean, using the process described in Chapter 1. In terms of structure, the chapter first describes a vision for governance and reports on the priorities assigned to the identified vision elements. It then discusses how the vision might be achieved by taking into account assisting factors (those that facilitate achievement) and resisting factors (those that inhibit achievement). The chapter concludes with guidance on the strategic direction needed to implement the vision, identifying specific actions to be undertaken for each of the vision elements.
The Vision
The occupational breakdown of members of the Governance Working Group reflected the diversity of affiliations present at the EBM Symposium and included governmental, intergovernmental, academic, non-governmental and private sector (fishers, fishing industry and consulting) representatives. With guidance provided by the facilitator, this diverse grouping of participants was asked to first address the question: “What do you see in place in 10 years’ time when EBM/EAF has become a reality in the Caribbean?” This diversity provided for a fruitful and comprehensive visioning process, the results of which are summarised in Table 25.1, in terms of the key vision elements and their subcomponents, and in Figure 25.1, which illustrates the level of priority assigned to each of the vision elements.
Three different types of bivalent influenza virus vaccine (a whole virus vaccine, an aqueous-surface-antigen vaccine and an adsorbed-surface-antigen vaccine) were tested at three dosage levels in volunteers primed with respect to only one of the haemagglutinin antigens present in the vaccines.
The local and systemic reactions to all three vaccine types were mild in nature and, following first immunization, the aqueous-surface-antigen vaccine was the least reactogenic. The serum haemagglutination-inhibiting antibody response to the A/Victoria/75 component of the vaccines, to which the volunteer population was primed, was greatest following immunization with the aqueous-surface-antigen vaccine; the greatest antibody response to the A/New Jersey/76 component of the vaccines was observed following immunization with whole virus vaccine.
One hundred and thirty men and women attending psychiatric hospitals with depressive disorders were interviewed at the time of their initial contact. After a mean four month interval, 119 were reassessed in order to test the hypothesis that initial levels of social support predict clinical improvement even when other potential risk factors such as age, sex, diagnosis and severity of depression are controlled. Severity and duration of the episode emerged as the only significant background predictors of recovery. The explained variance in recovery from depression due to social support was equal in men and women, and was not diminished by the background clinical predictors. According to subset analyses however, the aspects of personal relationships and perceived support that predict recovery in men and in women appear to be different. The available multiple regression models of outcome favoured a main effect of social support and provided persuasive if inconclusive evidence for a statistical interaction effect with sex. The implications for further research and for theory are discussed.
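The abstract above describes regression models that favoured a main effect of social support and suggested an interaction with sex. The sketch below shows, on simulated data, one conventional way to test such a main effect together with a support-by-sex interaction; the variable names, the logistic specification and the simulated effects are assumptions, not the authors' models.

```python
# Illustrative sketch only (simulated data): main effect of social support on
# recovery plus a support x sex interaction, in the spirit of the models above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 119
df = pd.DataFrame({
    "support": rng.normal(0.0, 1.0, n),   # standardised social support at first contact (hypothetical)
    "female": rng.integers(0, 2, n),
    "severity": rng.normal(0.0, 1.0, n),
    "duration": rng.normal(0.0, 1.0, n),
})
lin = (0.6 * df["support"] - 0.5 * df["severity"] - 0.4 * df["duration"]
       + 0.3 * df["support"] * df["female"])
df["recovered"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-lin))).astype(int)

# 'support * female' expands to support + female + support:female (the interaction term)
model = smf.logit("recovered ~ support * female + severity + duration", data=df).fit(disp=False)
print(model.summary())
```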
Serum and urine samples from seven recombinant inbred mouse strains, derived from a cross between BALB/c and C57BL/6, were examined to determine the immunoglobulin heavy chain (IgCH) and the major urinary protein (MUP) allotypes. CXBG and CXBJ exhibited the same IgCH alleles as did BALB/c; the others resembled C57BL/6, thus providing no evidence of crossover types. Comparison of the Mup and brown coat colour (b) alleles (both on linkage group VIII) revealed that three of the strains resemble BALB/c and two resemble C57BL/6, whereas the CXBE and CXBI strains are crossover types.
In [3], Fuller introduced an index (now called the Fuller index) in order to study periodic solutions of ordinary differential equations. The objective of this paper is to give a simple generalisation of the Fuller index which can be used to study periodic points of flows in Banach spaces. We do not claim any significant breakthrough but merely suggest that the simplistic approach presented here might prove useful for the study of non-linear differential equations. We show that our results can be used to study functional differential equations.
The abstract theory of positive compact operators (acting in a partially ordered Banach space) has proved to be particularly useful in the theory of integral equations. In a recent paper (2) it was shown that many of the now classical theorems for positive compact operators can be extended to certain classes of non-compact operators. One result, proved in (2, Theorem 5), was a fixed point theorem for compressive k-set contractions (k < 1). The main result of this paper (Theorem 3.3) shows that some of the hypotheses of (2, Theorem 5) are unnecessary. We use techniques based on those used by M. A. Krasnoselskii in the proof of Theorem 4.12 in (4), which is the classical fixed point theorem for compressive compact operators, to obtain a complete generalisation of this classical result to the k-set contractions (k < 1). It should be remarked that J. D. Hamilton has extended the same result to A-proper mappings (3, Theorem 1). However, it is apparently not known, even in the case of a Π1-space, whether k-set contractions are A-proper or not.
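For readers unfamiliar with the terminology, the following is standard background, not a result of the paper. The Kuratowski measure of noncompactness of a bounded set $A$ in a Banach space is
\[
\alpha(A) \;=\; \inf\{\, d > 0 : A \text{ can be covered by finitely many sets of diameter at most } d \,\},
\]
and a continuous map $T$ is a $k$-set contraction if $\alpha(T(A)) \le k\,\alpha(A)$ for every bounded set $A$ in its domain. Compact maps are $0$-set contractions, and the compressive fixed point theorems discussed above concern the case $k < 1$.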
Our aim was to measure the effects of irradiation treatments on soil communities from three different soils. Undisturbed soil cores from three temperate sites (deciduous and coniferous woodland and grassland) were irradiated to give cumulative doses from 0 to 160 Gy. Cores were incubated at 15°C, and three cores from each treatment were sampled after <1, 3 and 8 d. Soil fungi and heterotrophic bacteria were enumerated, and the activity and functional diversity of soil microbial communities were assessed in terms of their potential to utilise a range of C-sources. Although no significant treatment effects were observed in the numbers of cultivable fungi or fast-growing heterotrophic bacteria, the numbers of cultivable Pseudomonas spp. declined in all three soils after irradiation at 80 and 160 Gy. Microbial communities from the coniferous forest soil also showed a dramatic decrease in metabolic activity and in the number of substrates utilised after irradiation at 160 Gy. Gamma irradiation had a greater effect on microbial communities in the two organic forest soils than in the mineral grassland soil; this could be related to variations in the physico-chemical shielding properties of the soils and in the indigenous communities in terms of radio-resistant species.