Many paediatric studies report that patients must be established on aspirin therapy for a minimum of 5 days to achieve an adequate response. This is not always practical, especially in critical care settings. Prospectively identifying patients who are unresponsive to aspirin sooner could help prevent thrombotic events.
Aims:
The aim of this study was to investigate prospectively whether the first dose of aspirin is effective in decreasing platelet aggregation and thromboxane formation, and whether this effect can be measured 2 hours after dosing in paediatric cardiology patients. A secondary aim was to identify a cut-off for a novel marker of aspirin responsiveness, the maximum amplitude with arachidonic acid, which could substantially reduce the blood volume required. Third, we aimed to prospectively identify potentially non-responsive patients by spiking a sample of their blood ex vivo with aspirin.
Results:
The majority (92.3%) of patients were responsive when measured 2 hours after the first dose of aspirin. Non-response or inadequate response (7.7%) can likewise be identified at this 2-hour time point. Additionally, we have shown a novel way to reduce blood sample volume requirements by measuring the maximum amplitude with arachidonic acid as a marker of response, which is particularly useful for monitoring.
Conclusions:
These findings of rapid efficacy in the majority of patients offer practical, well-founded reassurance to attending clinicians, patients, and families.
Older adults have low levels of mental health literacy relating to anxiety, which may contribute to delayed help-seeking or not seeking help at all. Lifestyle interventions, including physical activity (PA), have growing evidence supporting their effectiveness in reducing anxiety. The COVID-19 pandemic also highlighted the potential for technology to facilitate healthcare provision. This study aimed to investigate older adults' perspectives on their understanding of anxiety, the possible use of PA interventions to reduce anxiety, and whether technology could support this process.
Methods:
The INDIGO trial evaluated a PA intervention for participants aged 60 years and above at risk of cognitive decline and not meeting PA guidelines. Twenty-nine of the INDIGO trial completers, including some with anxiety and/or cognitive symptoms, attended this long-term follow-up study including semi-structured qualitative interviews. Transcripts were analyzed thematically.
Results:
Understanding of anxiety varied considerably among participants. Some were able to describe anxiety as involving worry, uncertainty and fear, and related it to physical manifestations and feeling out of control. Others had less understanding of the concept of anxiety or found it confusing. Participants generally believed that PA could reduce anxiety and thought this could occur through a “mindfulness” and/or “physiological” process. Technology use was more contentious: some participants clearly expressed a dislike or distrust of technology, or reported limited access to it or limited technological literacy. Participants who supported using technology described how it could help with motivation, information provision and health monitoring. Wearable activity monitors were described favorably, with online platforms and portable devices also seen as options.
Conclusion:
Our results highlight the importance of providing more information and education about anxiety to older adults. This may increase awareness of anxiety and reduce delays in, or avoidance of, help-seeking. The findings also emphasize the need for clinicians to support understanding of anxiety in the older adults they see and to provide information and education where needed. PA interventions to reduce anxiety, with the option of a supported technology component, are likely to be acceptable to most older adults.
This paper provides an overview and appraisal of the International Design Engineering Annual (IDEA) challenge, a virtually hosted design hackathon run with the aim of generating a design research dataset that can provide insights into design activities at such events. The resulting dataset consists of more than 200 prototypes with over 1300 connections, providing insights into the products, processes and people involved in the design process. The paper also provides recommendations for future deployments of virtual hackathons for design research.
Psychotic-like experiences (PLEs) are risk factors for the development of psychiatric conditions like schizophrenia, particularly if associated with distress. As PLEs have been related to alterations in both white matter and cognition, we investigated whether cognition (g-factor and processing speed) mediates the relationship between white matter and PLEs.
Methods
We investigated two independent samples (6170 and 19 891 participants) from the UK Biobank through path analysis. For both samples, measures of whole-brain fractional anisotropy (gFA) and mean diffusivity (gMD), as indicators of white matter microstructure, were derived from probabilistic tractography. For the smaller sample, whole-brain white matter network efficiency and microstructure variables were also derived from structural connectome data.
Results
Cognition did not significantly mediate the relationships between white matter properties and PLEs. However, lower gFA was associated with having PLEs in combination with distress in the full available sample (standardized β = −0.053, p = 0.011). Additionally, lower gFA/higher gMD was associated with lower g-factor (standardized β = 0.049, p < 0.001; standardized β = −0.027, p = 0.003), and this association was partially mediated by processing speed, with a proportion mediated of 7% (p < 0.001) for gFA and 11% (p < 0.001) for gMD.
Conclusions
We show that lower global white matter microstructure is associated with having PLEs in combination with distress, which suggests a direction of future research that could help clarify how and why individuals progress from subclinical to clinical psychotic symptoms. Furthermore, we replicated that processing speed mediates the relationship between white matter microstructure and g-factor.
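The "proportion mediated" statistic reported above can be illustrated with a minimal sketch of a single-mediator path model. The coefficients below are hypothetical, chosen only to reproduce a proportion of roughly 7%; they are not the study's estimates.

```python
def proportion_mediated(indirect_effect, direct_effect):
    """Fraction of the total effect carried by the mediator
    (indirect / total) in a simple single-mediator path model."""
    total_effect = indirect_effect + direct_effect
    return indirect_effect / total_effect

# Hypothetical standardized path coefficients (not from the study):
a, b = 0.10, 0.35        # white matter -> speed, speed -> g-factor
direct = 0.46            # direct effect of white matter on g-factor
indirect = a * b         # indirect effect through processing speed
print(round(proportion_mediated(indirect, direct), 2))  # 0.07
```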
Due to shortages of N95 respirators during the coronavirus disease 2019 (COVID-19) pandemic, it is necessary to estimate the number of N95s required for healthcare workers (HCWs) to inform manufacturing targets and resource allocation.
Methods:
We developed a model to determine the number of N95 respirators needed for HCWs both in a single acute-care hospital and across the United States.
Results:
For an acute-care hospital with 400 all-cause monthly admissions, the number of N95 respirators needed to manage COVID-19 patients admitted during a month ranges from 113 (95% interpercentile range [IPR], 50–229) if 0.5% of admissions are COVID-19 patients to 22,101 (95% IPR, 5,904–25,881) if 100% of admissions are COVID-19 patients (assuming single use per respirator, and 10 encounters between HCWs and each COVID-19 patient per day). The number of N95s needed decreases to a range of 22 (95% IPR, 10–43) to 4,445 (95% IPR, 1,975–8,684) if each N95 is used for 5 patient encounters. Varying monthly all-cause admissions to 2,000 requires 6,645–13,404 respirators with a 60% COVID-19 admission prevalence, 10 HCW–patient encounters, and reusing N95s 5–10 times. Nationally, the number of N95 respirators needed over the course of the pandemic ranges from 86 million (95% IPR, 37.1–200.6 million) to 1.6 billion (95% IPR, 0.7–3.6 billion) as 5%–90% of the population is exposed (single-use). This number ranges from 17.4 million (95% IPR, 7.3–41 million) to 312.3 million (95% IPR, 131.5–737.3 million) using each respirator for 5 encounters.
Conclusions:
We quantified the number of N95 respirators needed for a given acute-care hospital and nationally during the COVID-19 pandemic under varying conditions.
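The scaling behind these estimates can be sketched deterministically. The published model is stochastic (hence the interpercentile ranges), and the length-of-stay parameter below is an assumption for illustration, not a figure from the study.

```python
def n95_demand(monthly_admissions, covid_fraction,
               encounters_per_day, los_days, uses_per_respirator=1):
    """Point estimate of N95 respirators needed per month.

    los_days (assumed COVID-19 patient length of stay) is a
    hypothetical input; the study samples such inputs stochastically.
    """
    covid_patients = monthly_admissions * covid_fraction
    encounters = covid_patients * encounters_per_day * los_days
    return encounters / uses_per_respirator

# Reusing each respirator for 5 encounters cuts demand five-fold:
single_use = n95_demand(400, 1.0, 10, 5, uses_per_respirator=1)
reused = n95_demand(400, 1.0, 10, 5, uses_per_respirator=5)
print(single_use, reused)  # 20000.0 4000.0
```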
Optical tracking systems typically trade off astrometric precision against field of view. In this work, we showcase a networked approach to optical tracking, using very wide field-of-view imagers with relatively low astrometric precision, applied to the scheduled OSIRIS-REx slingshot manoeuvre around Earth on 22 September 2017. As part of a trajectory designed to take OSIRIS-REx to NEO 101955 Bennu, this flyby event was viewed from 13 remote sensors spread across Australia and New Zealand to enable triangulation. Each observatory in this portable network was constructed to be as lightweight and portable as possible, with hardware based on the successful design of the Desert Fireball Network. Over a 4-h collection window, we gathered 15 439 images of the night sky in the predicted direction of the OSIRIS-REx spacecraft. Using a specially developed streak detection and orbit determination data pipeline, we detected 2 090 line-of-sight observations. Our fitted orbit was determined to be within about 10 km of orbital telemetry along the observed 109 262 km length of the OSIRIS-REx trajectory, demonstrating the capability of a networked approach to Space Surveillance and Tracking.
Measles is a notifiable disease, but not everyone infected seeks care, nor is every consultation reported. We estimated the completeness of reporting during a measles outbreak in The Netherlands in 2013–2014. Children below 15 years of age in a low vaccination coverage community (n = 3422) received a questionnaire to identify measles cases. Cases found in the survey were matched with the register of notifiable diseases to estimate the completeness of reporting. Second, completeness of reporting was assessed by comparing the number of susceptible individuals prior to the outbreak with the number of reported cases in the surveyed community and on a national level.
We found 307 self-identified measles cases (15%) among the 2077 returned questionnaires (61% response), of which 27 could be matched to a case reported to the national register, giving a completeness of reporting of 8.8%. Based on the number of susceptible individuals and the number of reported cases, completeness of reporting was estimated at 9.1% in the surveyed community and 8.6% at national level. The two approaches gave almost identical estimates, which supports the credibility and validity of both. The 2013–2014 outbreak comprised approximately 31 400 measles infections.
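The matching-based estimate follows directly from the reported counts (a minimal sketch; the susceptibility-based estimates require data not given in the abstract):

```python
survey_cases = 307   # self-identified cases in returned questionnaires
matched = 27         # survey cases found in the national register

# Completeness of reporting: fraction of known cases that were notified
completeness = matched / survey_cases
print(f"{completeness:.1%}")  # 8.8%
```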
Recent evidence suggests that exercise plays a role in cognition and that the posterior cingulate cortex (PCC) can be divided into dorsal and ventral subregions based on distinct connectivity patterns.
Aims
To examine the effect of physical activity and of the dorsal/ventral division of the PCC on brain functional connectivity measures in subjective memory complainers (SMC) carrying the ε4 allele of apolipoprotein E (APOE ε4).
Method
Participants were 22 SMC carrying the APOE ɛ4 allele (ɛ4+; mean age 72.18 years) and 58 SMC non-carriers (ɛ4–; mean age 72.79 years). Connectivity of four dorsal and ventral seeds was examined. Relationships between PCC connectivity and physical activity measures were explored.
Results
ɛ4+ individuals showed increased connectivity between the dorsal PCC and dorsolateral prefrontal cortex, and the ventral PCC and supplementary motor area (SMA). Greater levels of physical activity correlated with the magnitude of ventral PCC–SMA connectivity.
Conclusions
The results provide the first evidence that ɛ4+ individuals at increased risk of cognitive decline show distinct alterations in dorsal and ventral PCC functional connectivity.
Introduction: Point of care ultrasound (PoCUS) has become an established tool in the initial management of patients with undifferentiated hypotension in the emergency department (ED). Current established protocols (e.g. RUSH and ACES) were developed by expert user opinion rather than objective, prospective data. Recently the SHoC Protocol was published, recommending three core scans (cardiac, lung, and IVC), plus other scans when clinically indicated. We report the abnormal ultrasound findings from our international multicenter randomized controlled trial, to assess whether the three recommended core SHoC protocol scans were chosen appropriately for this population. Methods: Recruitment occurred at seven centres in North America (4) and South Africa (3). Screening at triage identified patients (SBP<100 or shock index>1) who were randomized to PoCUS or control (standard care with no PoCUS) groups. All scans were performed by PoCUS-trained physicians within one hour of arrival in the ED. Demographics, clinical details and study findings were collected prospectively. A threshold incidence of 10% for positive findings was set as significant for assessing the appropriateness of the core recommendations. Results: 138 patients had a PoCUS screen completed. All patients had cardiac, lung, IVC, aorta, abdominal, and pelvic scans. Reported abnormal findings included hyperdynamic LV function (59; 43%); small collapsing IVC (46; 33%); pericardial effusion (24; 17%); pleural fluid (19; 14%); hypodynamic LV function (15; 11%); large poorly collapsing IVC (13; 9%); peritoneal fluid (13; 9%); and aortic aneurysm (5; 4%). Conclusion: The three core SHoC Protocol recommendations included appropriate scans to detect all pathologies recorded at a rate greater than 10%. The three most frequent findings were cardiac and IVC abnormalities, followed by lung. Peritoneal fluid was seen at a rate of 9%. Aortic aneurysms were rare.
These data, from the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients, support the use of the prioritized SHoC protocol, though a larger study is required to confirm these findings.
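The reported finding percentages, and the check against the 10% threshold, follow directly from the counts over the 138 screened patients:

```python
n_screened = 138
# Abnormal finding counts as reported in the Results
findings = {
    "hyperdynamic LV function": 59,
    "small collapsing IVC": 46,
    "pericardial effusion": 24,
    "pleural fluid": 19,
    "peritoneal fluid": 13,
}

for name, count in findings.items():
    rate = count / n_screened
    flag = "core-relevant" if rate > 0.10 else "below threshold"
    print(f"{name}: {count}/{n_screened} = {rate:.0%} ({flag})")
```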
Introduction: Point of care ultrasound (PoCUS) is an established tool in the initial management of patients with undifferentiated hypotension in the emergency department (ED). While PoCUS protocols have been shown to improve early diagnostic accuracy, there is little published evidence for any mortality benefit. We report the findings from our international multicenter randomized controlled trial assessing the impact of a PoCUS protocol on survival and key clinical outcomes. Methods: Recruitment occurred at 7 centres in North America (4) and South Africa (3). Scans were performed by PoCUS-trained physicians. Screening at triage identified patients (SBP<100 or shock index>1), who were randomized to PoCUS or control (standard care and no PoCUS) groups. Demographics, clinical details and study findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. The primary outcome measure was 30-day/discharge mortality. Secondary outcome measures included diagnostic accuracy, changes in vital signs, acid-base status, and length of stay. Categorical data were analyzed using Fisher's exact test, and continuous data by Student's t test and multi-level log-regression testing (GraphPad/SPSS). Final chart review was blinded to initial impressions and PoCUS findings. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no difference between groups for the primary outcome of mortality: PoCUS 32/129 (24.8%; 95% CI 14.3-35.3%) vs. control 32/129 (24.8%; 95% CI 14.3-35.3%); RR 1.00 (95% CI 0.869 to 1.15; p=1.00). There were no differences in the secondary outcomes of ICU and total length of stay. Our sample size has a power of 0.80 (α:0.05) for a moderate effect size. Other secondary outcomes are reported separately.
Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We did not find any mortality or length of stay benefit with the use of a PoCUS protocol, though a larger study is required to confirm these findings. While PoCUS may have diagnostic benefits, these may not translate into a survival benefit.
Introduction: Point of Care Ultrasound (PoCUS) protocols are commonly used to guide resuscitation for emergency department (ED) patients with undifferentiated non-traumatic hypotension. While PoCUS has been shown to improve early diagnosis, there is minimal evidence for any outcome benefit. We completed an international multicenter randomized controlled trial (RCT) to assess the impact of a PoCUS protocol on key resuscitation markers in this group. We report diagnostic impact and mortality elsewhere. Methods: The SHoC-ED1 study compared the addition of PoCUS to standard care within the first hour of treatment of adult patients presenting with undifferentiated hypotension (SBP<100 mmHg or a Shock Index >1.0) with a control group that did not receive PoCUS. Scans were performed by PoCUS-trained physicians. Four North American and 3 South African sites participated in the study. Resuscitation outcomes analyzed included volume of fluid administered in the ED and changes in shock index (SI), modified early warning score (MEWS), venous acid-base balance, and lactate, at one and four hours. Comparisons utilized a t test as well as stratified binomial log-regression to assess for any significant improvement in resuscitation among the outcomes. Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no significant difference in mean total volume of fluid received between the control (1658 ml; 95%CI 1365-1950) and PoCUS groups (1609 ml; 1385-1832; p=0.79). Significant improvements with resuscitation were seen in SI, MEWS, lactate and bicarbonate in both the PoCUS and control groups; however, there was no difference between groups. Conclusion: SHoC-ED1 is the first RCT to compare PoCUS to standard care in hypotensive ED patients.
No significant difference in fluid administered or in markers of resuscitation was found when comparing the use of a PoCUS protocol to standard care in the resuscitation of patients with undifferentiated hypotension.
Introduction: Point of care ultrasonography (PoCUS) is an established tool in the initial management of hypotensive patients in the emergency department (ED). It has been shown to rule out certain shock etiologies and to improve diagnostic certainty; however, evidence of benefit in the management of hypotensive patients is limited. We report the findings from our international multicenter RCT assessing the impact of a PoCUS protocol on diagnostic accuracy, as well as other key outcomes, including mortality, which are reported elsewhere. Methods: Recruitment occurred at 4 North American and 3 Southern African sites. Screening at triage identified patients (SBP<100 mmHg or shock index >1) who were randomized to either PoCUS or control groups. Scans were performed by PoCUS-trained physicians. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. Final chart review was blinded to initial impressions and PoCUS findings. Categorical data were analyzed using Fisher's exact test (two-tailed). Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. The perceived shock category changed more frequently in the PoCUS group: 20/127 (15.7%) vs. control 7/125 (5.6%); RR 2.81 (95% CI 1.23 to 6.42; p=0.0134). There was no significant difference in change of diagnostic impression between groups: PoCUS 39/123 (31.7%) vs control 34/124 (27.4%); RR 1.16 (95% CI 0.786 to 1.70; p=0.4879). There was no significant difference in the rate of correct category of shock between PoCUS (118/127; 93%) and control (113/122; 93%); RR 1.00 (95% CI 0.936 to 1.08; p=1.00), or for correct diagnosis: PoCUS 90/127 (70%) vs control 86/122 (70%); RR 0.987 (95% CI 0.671 to 1.45; p=1.00).
Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We found that the use of PoCUS changed physicians’ perceived shock category but did not improve diagnostic accuracy for category of shock or diagnosis.
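The relative risk for a change in perceived shock category can be reproduced from the reported counts (a minimal sketch; the confidence interval would require the standard log-RR method, omitted here):

```python
# Counts reported in the Results: changed category / group size
pocus_changed, pocus_n = 20, 127
control_changed, control_n = 7, 125

# Relative risk = risk in PoCUS group / risk in control group
rr = (pocus_changed / pocus_n) / (control_changed / control_n)
print(round(rr, 2))  # 2.81
```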
Civilian suicide rates vary by occupation in ways related to occupational stress exposure. Comparable military research finds suicide rates elevated in combat arms occupations. However, no research has evaluated whether this pattern varies by deployment history, the indicator of occupational stress widely considered responsible for the recent rise in the military suicide rate.
Method
The joint associations of Army occupation and deployment history in predicting suicides were analysed in an administrative dataset for the 729 337 male enlisted Regular Army soldiers in the US Army between 2004 and 2009.
Results
There were 496 suicides over the study period (22.4/100 000 person-years). Only two occupational categories, both in combat arms, had significantly elevated suicide rates: infantrymen (37.2/100 000 person-years) and combat engineers (38.2/100 000 person-years). However, suicide rates in these two categories were significantly lower when currently deployed (30.6/100 000 person-years) than when never deployed or previously deployed (41.2–39.1/100 000 person-years), whereas the suicide rate of other soldiers was significantly higher when currently or previously deployed (20.2–22.4/100 000 person-years) than when never deployed (14.5/100 000 person-years). As a result, the adjusted suicide rate of infantrymen and combat engineers was most elevated when never deployed [odds ratio (OR) 2.9, 95% confidence interval (CI) 2.1–4.1], less so when previously deployed (OR 1.6, 95% CI 1.1–2.1), and not at all when currently deployed (OR 1.2, 95% CI 0.8–1.8). Adjustment for a differential ‘healthy warrior effect’ cannot explain this variation in the relative suicide rates of never-deployed infantrymen and combat engineers by deployment status.
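Rates per 100 000 person-years follow from counts and exposure time. The person-years figure below is back-derived from the reported overall rate for illustration; it is not stated in the abstract.

```python
def rate_per_100k(deaths, person_years):
    """Crude suicide rate per 100 000 person-years."""
    return deaths / person_years * 100_000

# Person-years back-derived from 496 deaths at 22.4/100 000 p-y
# (an assumption for illustration, not a figure from the paper):
person_years = 496 / 22.4 * 100_000   # ≈ 2.21 million person-years
print(round(rate_per_100k(496, person_years), 1))  # 22.4
```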
Conclusions
Efforts are needed to elucidate the causal mechanisms underlying this interaction to guide preventive interventions for soldiers at high suicide risk.
Understanding the spatial distribution of disease is critical for effective disease control. Where formal address networks do not exist, tracking spatial patterns of clinical disease is difficult. Geolocation strategies were tested at rural health facilities in western Kenya. Methods included geocoding residence by head of compound, participatory mapping and recording the self-reported nearest landmark. Geocoding located 72·9% [95% confidence interval (CI) 67·7–77·6] of individuals to within 250 m of the true compound location. The participatory mapping exercise correctly located 82·0% (95% CI 78·9–84·8) of compounds to a 2 × 2·5 km area with a 500 m buffer. The self-reported nearest landmark located 78·1% (95% CI 73·8–82·1) of compounds to the correct catchment area. The strategies tested provide options for quickly obtaining spatial information on individuals presenting at health facilities.
We investigated whether straight-line distance from residential compounds to healthcare facilities influenced mortality, the incidence of pneumonia and vaccine efficacy against pneumonia in rural Gambia. Clinical surveillance for pneumonia was conducted on 6938 children living in the catchment areas of the two largest healthcare facilities. Deaths were monitored by three-monthly home visits. Children living >5 km from the two largest healthcare facilities had a 2·78 [95% confidence interval (CI) 1·74–4·43] times higher risk of all-cause mortality compared to children living within 2 km of these facilities. The observed rate of clinical and radiological pneumonia was lower in children living >5 km from these facilities compared to those living within 2 km [rate ratios 0·65 (95% CI 0·57–0·73) and 0·74 (95% CI 0·55–0·98), respectively]. There was no association between distance and estimated pneumococcal vaccine efficacy. Geographical access to healthcare services is an important determinant of survival and pneumonia in children in rural Gambia.
It has been postulated that aging is the consequence of an accelerated accumulation of somatic DNA mutations and that subsequent errors in the primary structure of proteins ultimately reach levels sufficient to affect organismal function. The technical limitations of detecting somatic changes, and the lack of insight into the minimum level of erroneous proteins needed to cause an error catastrophe, have hampered any firm conclusions on these theories. In this study, we sequenced the whole genomes of whole-blood DNA from two pairs of monozygotic (MZ) twins, aged 40 and 100 years, on two independent next-generation sequencing (NGS) platforms (Illumina and Complete Genomics). Potentially discordant single-base substitutions supported by both platforms were validated extensively by Sanger, Roche 454, and Ion Torrent sequencing. We demonstrate that the genomes of the two twin pairs are germ-line identical between co-twins, and that the genomes of the 100-year-old MZ twins differ by eight confirmed somatic single-base substitutions, five of which lie within introns. Putative somatic variation between the 40-year-old twins was not confirmed in the validation phase. We conclude from this systematic effort that somatic single-nucleotide substitutions can be detected using two independent NGS platforms, and that a century of life did not result in a large number of detectable somatic mutations in blood. The low number of somatic variants observed by using two NGS platforms might provide a framework for detecting disease-related somatic variants in phenotypically discordant MZ twins.
We present the first sample of diffuse interstellar bands (DIBs) in the nearby galaxy M33. Studying DIBs in other galaxies allows the behaviour of the carriers to be examined under interstellar conditions which can be quite different from those of the Milky Way, and to determine which DIB properties can be used as reliable probes of extragalactic interstellar media. Multi-object spectroscopy of 43 stars in M33 has been performed using Keck/DEIMOS. The stellar spectral types were determined and combined with literature photometry to determine the M33 reddenings E(B-V)M33. Equivalent widths or upper limits have been measured for the λ5780 DIB towards each star. DIBs were detected towards 20 stars, demonstrating that their carriers are abundant in M33. The relationship with reddening is found to be at the upper end of the range observed in the Milky Way. The line of sight towards one star has an unusually strong ratio of DIB equivalent width to E(B-V)M33, and a total of seven DIBs were detected towards this star.
This work focusses on MWC 922, the central object in the Red Square Nebula. We obtained low- and medium-resolution spectra of both the central object and the surrounding nebula using the DIS and TSpec spectrographs. The spectra cover the spectral range from ~3 500 Å to ~25 000 Å. The central object shows a plethora of emission lines, including many permitted Fe II and forbidden [Fe II] lines. Here, we present the inventory of the emission lines of the central object, MWC 922. Future work will comprise the identification of the nebular emission lines using newly obtained X-Shooter spectra. In this way we aim to gain further insight into the physical and chemical conditions in this environment. A comparison of the Red Square with the Red Rectangle Nebula is anticipated and will guide our search for DIBs in emission.
Pelagic ecosystems and their fisheries are of particular economic and social importance to the countries and territories of the Wider Caribbean. In some countries (e.g. Barbados, Grenada), commercial pelagic fisheries already contribute significantly to total landings and to foreign exchange earnings from seafood exports. Ports and postharvest facilities service the vessels, which range from artisanal canoes to industrial longliners, and their catch, which often reaches tourists as well as locals (Mahon and McConney 2004). In other places where the focus has previously been on inshore and demersal fisheries (e.g. Antigua and Barbuda, Belize), there is growing interest in the potential of pelagic fisheries development. This potential lies not only in commercial fisheries but also in the high-revenue, conservation-aware recreational fisheries well established in a few locations (e.g. Puerto Rico, Costa Rica) and undertaken at a lower level in many others.
Underlying all of this is the complexity arising from the fact that many of the valued pelagic species are migratory or highly migratory shared and straddling stocks, falling under the 1995 United Nations Fish Stocks Agreement and subject to several international instruments and management regimes, such as those of the International Commission for the Conservation of Atlantic Tunas (ICCAT). The web of linkages across Caribbean marine jurisdictions and organizations is complex (McConney et al. 2007). The related issues call for an ecosystem approach (McConney and Salas Chapter 7; Schuhmann et al. Chapter 8), and some progress has already been made at multiple levels (Fanning and Oxenford Chapter 16; Singh-Renton et al. Chapter 14).
This synthesis chapter presents the outputs of facilitated symposium sessions specifically related to achieving and implementing a shared vision for the pelagic ecosystem in marine ecosystem based management (EBM) in the Wider Caribbean. The methodology was described in Chapter 1 of this volume. This chapter first describes a vision for the pelagic ecosystem and reports on the priorities assigned to the identified vision elements. It then addresses how the vision might be achieved by taking into account assisting factors (those that facilitate achievement) and resisting factors (those that inhibit achievement). The chapter concludes with guidance on the strategic direction needed to implement the vision, identifying specific actions to be undertaken for each of the vision elements.