OBJECTIVES/GOALS: Contingency management (CM) procedures yield measurable reductions in cocaine use. This poster describes a trial that uses CM as a vehicle to demonstrate the biopsychosocial health benefits of reduced use, rather than total abstinence, the currently accepted metric of treatment efficacy. METHODS/STUDY POPULATION: In this 12-week randomized controlled trial, CM was used to reduce cocaine use and to evaluate associated improvements in cardiovascular, immune, and psychosocial well-being. Adults aged 18 and older who sought treatment for cocaine use (N=127) were randomized in a 1:1:1 ratio to High Value ($55) CM incentives for cocaine-negative urine samples, Low Value ($13) CM incentives, or a non-contingent control group. Participants completed outpatient sessions three days per week across the 12-week intervention period, totaling 36 clinic visits plus four post-treatment follow-up visits. At each visit, participants provided observed urine samples and completed several assays of biopsychosocial health. RESULTS/ANTICIPATED RESULTS: Preliminary findings from generalized linear mixed-effects modeling demonstrate the feasibility of the CM platform. Abstinence rates from cocaine use were significantly greater in the High Value group (47% negative; OR = 2.80; p = 0.01) relative to the Low Value (23% negative) and Control (24% negative) groups. In the planned primary analysis, the level of cocaine use reduction, based on cocaine-negative urine samples, will serve as the primary predictor of cardiovascular (e.g., endothelin-1 levels), immune (e.g., IL-10 levels), and psychosocial (e.g., Addiction Severity Index) outcomes using results from the fitted models. DISCUSSION/SIGNIFICANCE: This research will advance the field by prospectively and comprehensively demonstrating the beneficial effects of reduced cocaine use. These outcomes can, in turn, support the adoption of reduced cocaine use as a viable alternative endpoint in cocaine treatment trials.
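As a rough consistency check, the reported effect size can be reproduced from the group-level abstinence percentages in the abstract. A minimal Python sketch, assuming the OR compares the High Value group against the Control group:

    # Reproduce the reported odds ratio from group-level abstinence rates.
    # Assumes the OR compares High Value (47% negative) vs. Control (24% negative).
    def odds(p):
        """Convert a proportion to odds."""
        return p / (1.0 - p)

    p_high, p_control = 0.47, 0.24
    print(f"OR = {odds(p_high) / odds(p_control):.2f}")  # ~2.81, close to the reported 2.80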
This paper provides an overview and appraisal of the International Design Engineering Annual (IDEA) challenge, a virtually hosted design hackathon run with the aim of generating a design research dataset that can provide insights into design activities at virtually hosted hackathons. The resulting dataset consists of more than 200 prototypes with over 1,300 connections, providing insights into the products, processes, and people involved in the design process. The paper also provides recommendations for future deployments of virtual hackathons for design research.
Prototyping is a vital activity in product development. For reasons of time, cost, and level of definition, low-fidelity representations of products are used to advance understanding and progress the design. With the advent of Mixed Reality prototyping, the ways in which abstractions of different fidelities can be created have multiplied, but there is no guidance on how best to specify this abstraction. In this paper, a taxonomy of the dimensions of product fidelity is proposed so that both designers and researchers can better understand how fidelity can be managed to maximise prototype value.
The RemoveDEBRIS mission was the first mission to successfully demonstrate, in orbit, a series of technologies that can be used for the active removal of space debris. The mission started late in 2014, sponsored by a European Commission (EC) grant, with a consortium led by the Surrey Space Centre developing the mission from concept through in-orbit demonstrations, which concluded in March 2019. Technologies for the capture of large space debris, such as a net and a harpoon, were successfully tested, together with hardware and software to retrieve data on non-cooperative target debris kinematics from observations carried out with on-board cameras. The final demonstration consisted of the deployment of a drag sail to increase the drag of the satellite and accelerate its demise.
In June 2012, the Botswana Ministry of Health and Wellness (MOHW; Gaborone, Botswana) initiated a national Emergency Medical Services (EMS) system in response to significant morbidity and mortality associated with prehospital emergencies. The MOHW requested external expertise to train its developing workforce. Simulation-based training was planned to equip these health care providers with clinical knowledge, procedural skills, and communication techniques.
Objective
The objective of this study was to assess the educational needs of the pioneer Botswana MOHW EMS providers based on retrospective EMS logbook review and EMS provider feedback to guide development of a novel educational curriculum.
Methods
Data were abstracted from a representative sample of the Gaborone, Botswana MOHW EMS response log from 2013-2014 and were quantified into the five most common call types for both adults and children. Informal focus groups with health professionals and EMS staff, as well as surveys, were used to rank common response call types and self-perceived educational needs.
Results
Based on 1,506 calls, the most common adult response calls were for obstetric emergencies, altered mental status, gastrointestinal/abdominal pain, trauma, gynecological emergencies, and cardiovascular and respiratory distress-related emergencies. The most common pediatric response calls were for respiratory distress, gastrointestinal complaints/dehydration, trauma and musculoskeletal injuries, newborn delivery, seizures, and toxic ingestion/exposure. The EMS providers identified these same chief complaints as priorities for training using the qualitative approach. A locally relevant, simulation-based curriculum for the Botswana MOHW EMS system was developed and implemented based on these data.
Conclusions
Trauma, respiratory distress, gastrointestinal complaints, and puerperal/perinatal emergencies were common conditions for all age groups. Other age-specific conditions were also identified as educational needs based on epidemiologic data and provider feedback. This needs assessment may be useful when designing locally relevant EMS curricula in other low-income and middle-income countries.

Glomb NW, Kosoko AA, Doughty CB, Rus MC, Shah MI, Cox M, Galapi C, Parkes PS, Kumar S, Laba B. Needs Assessment for Simulation Training for Prehospital Providers in Botswana. Prehosp Disaster Med. 2018;33(6):621–626.
Introduction: Point of care ultrasound (PoCUS) has become an established tool in the initial management of patients with undifferentiated hypotension in the emergency department (ED). Existing established protocols (e.g., RUSH and ACES) were developed from expert user opinion rather than objective, prospective data. Recently the SHoC Protocol was published, recommending three core scans (cardiac, lung, and IVC), plus other scans when indicated clinically. We report the abnormal ultrasound findings from our international multicenter randomized controlled trial, to assess whether the three recommended core SHoC protocol scans were chosen appropriately for this population. Methods: Recruitment occurred at seven centres in North America (4) and South Africa (3). Screening at triage identified patients (SBP<100 or shock index>1) who were randomized to PoCUS or control (standard care with no PoCUS) groups. All scans were performed by PoCUS-trained physicians within one hour of arrival in the ED. Demographics, clinical details and study findings were collected prospectively. A threshold incidence of 10% for positive findings was established as significant for the purposes of assessing the appropriateness of the core recommendations. Results: 138 patients had a PoCUS screen completed. All patients had cardiac, lung, IVC, aorta, abdominal, and pelvic scans. Reported abnormal findings included hyperdynamic LV function (59; 43%); small collapsing IVC (46; 33%); pericardial effusion (24; 17%); pleural fluid (19; 14%); hypodynamic LV function (15; 11%); large poorly collapsing IVC (13; 9%); peritoneal fluid (13; 9%); and aortic aneurysm (5; 4%). Conclusion: The three core SHoC Protocol recommendations included appropriate scans to detect all pathologies recorded at a rate greater than 10 percent. The three most frequent findings were cardiac and IVC abnormalities, followed by lung. It is noted that peritoneal fluid was seen at a rate of 9%. Aortic aneurysms were rare. These data, from the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients, support the use of the prioritized SHoC protocol, though a larger study is required to confirm these findings.
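The 10% threshold check in the conclusion can be illustrated directly from the counts reported above. A minimal Python sketch using only figures from the abstract (n = 138 screened patients):

    # Flag abnormal PoCUS findings occurring above the 10% incidence threshold.
    N = 138
    findings = {
        "hyperdynamic LV function": 59,
        "small collapsing IVC": 46,
        "pericardial effusion": 24,
        "pleural fluid": 19,
        "hypodynamic LV function": 15,
        "large poorly collapsing IVC": 13,
        "peritoneal fluid": 13,
        "aortic aneurysm": 5,
    }
    for name, count in findings.items():
        rate = count / N
        label = "above threshold" if rate > 0.10 else "below threshold"
        print(f"{name}: {rate:.0%} ({label})")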
Introduction: Point of care ultrasound (PoCUS) is an established tool in the initial management of patients with undifferentiated hypotension in the emergency department (ED). While PoCUS protocols have been shown to improve early diagnostic accuracy, there is little published evidence for any mortality benefit. We report the findings from our international multicenter randomized controlled trial, assessing the impact of a PoCUS protocol on survival and key clinical outcomes. Methods: Recruitment occurred at 7 centres in North America (4) and South Africa (3). Scans were performed by PoCUS-trained physicians. Screening at triage identified patients (SBP<100 or shock index>1), who were randomized to PoCUS or control (standard care and no PoCUS) groups. Demographics, clinical details and study findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. The primary outcome measure was 30-day/discharge mortality. Secondary outcome measures included diagnostic accuracy, changes in vital signs, acid-base status, and length of stay. Categorical data were analyzed using Fisher's exact test, and continuous data using Student's t-test and multi-level log-regression testing (GraphPad/SPSS). Final chart review was blinded to initial impressions and PoCUS findings. Results: 258 patients were enrolled, with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no difference between groups for the primary outcome of mortality: PoCUS 32/129 (24.8%; 95% CI 14.3-35.3%) vs. Control 32/129 (24.8%; 95% CI 14.3-35.3%); RR 1.00 (95% CI 0.869 to 1.15; p=1.00). There were no differences in the secondary outcomes of ICU and total length of stay. Our sample size has a power of 0.80 (α:0.05) for a moderate effect size. Other secondary outcomes are reported separately. Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We did not find any mortality or length of stay benefit with the use of a PoCUS protocol, though a larger study is required to confirm these findings. While PoCUS may have diagnostic benefits, these may not translate into a survival benefit.
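The null primary-outcome comparison follows directly from the reported 2x2 counts. A minimal Python sketch (scipy's Fisher exact test stands in for the abstract's Fisher test; the counts are those reported above):

    # Primary outcome: 30-day/discharge mortality, 32/129 deaths in each arm.
    from scipy.stats import fisher_exact

    deaths_pocus, n_pocus = 32, 129
    deaths_ctrl, n_ctrl = 32, 129

    rr = (deaths_pocus / n_pocus) / (deaths_ctrl / n_ctrl)
    _, p = fisher_exact([[deaths_pocus, n_pocus - deaths_pocus],
                         [deaths_ctrl, n_ctrl - deaths_ctrl]])
    print(f"RR = {rr:.2f}, Fisher p = {p:.2f}")  # RR = 1.00, p = 1.00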
Introduction: Point of Care Ultrasound (PoCUS) protocols are commonly used to guide resuscitation for emergency department (ED) patients with undifferentiated non-traumatic hypotension. While PoCUS has been shown to improve early diagnosis, there is minimal evidence for any outcome benefit. We completed an international multicenter randomized controlled trial (RCT) to assess the impact of a PoCUS protocol on key resuscitation markers in this group. We report diagnostic impact and mortality elsewhere. Methods: The SHoC-ED1 study compared the addition of PoCUS to standard care within the first hour of treatment of adult patients presenting with undifferentiated hypotension (SBP<100 mmHg or a Shock Index >1.0) with a control group that did not receive PoCUS. Scans were performed by PoCUS-trained physicians. Four North American and three South African sites participated in the study. Resuscitation outcomes analyzed included volume of fluid administered in the ED, changes in shock index (SI), modified early warning score (MEWS), venous acid-base balance, and lactate, at one and four hours. Comparisons utilized a t-test, as well as stratified binomial log-regression, to assess for any significant improvement in resuscitation among the outcomes. Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled, with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no significant difference in mean total volume of fluid received between the control (1658 ml; 95%CI 1365-1950) and PoCUS groups (1609 ml; 1385-1832; p=0.79). Significant improvements were seen in SI, MEWS, lactate and bicarbonate with resuscitation in both the PoCUS and control groups; however, there was no difference between groups. Conclusion: SHoC-ED1 is the first RCT to compare PoCUS to standard care in hypotensive ED patients. No significant difference in fluid administered or in markers of resuscitation was found when comparing the use of a PoCUS protocol to standard care in the resuscitation of patients with undifferentiated hypotension.
Introduction: Point of care ultrasonography (PoCUS) is an established tool in the initial management of hypotensive patients in the emergency department (ED). It has been shown to rule out certain shock etiologies and improve diagnostic certainty; however, evidence on benefit in the management of hypotensive patients is limited. We report the findings from our international multicenter RCT assessing the impact of a PoCUS protocol on diagnostic accuracy, as well as other key outcomes, including mortality, which are reported elsewhere. Methods: Recruitment occurred at 4 North American and 3 South African sites. Screening at triage identified patients (SBP<100 mmHg or shock index >1) who were randomized to either PoCUS or control groups. Scans were performed by PoCUS-trained physicians. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. Final chart review was blinded to initial impressions and PoCUS findings. Categorical data were analyzed using Fisher's two-tailed exact test. Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled, with follow-up fully completed. Baseline comparisons confirmed effective randomization. The perceived shock category changed more frequently in the PoCUS group: 20/127 (15.7%) vs. control: 7/125 (5.6%); RR 2.81 (95% CI 1.23 to 6.42; p=0.0134). There was no significant difference in change of diagnostic impression between groups: PoCUS 39/123 (31.7%) vs. control 34/124 (27.4%); RR 1.16 (95% CI 0.786 to 1.70; p=0.4879). There was no significant difference in the rate of correct category of shock between PoCUS (118/127; 93%) and control (113/122; 93%); RR 1.00 (95% CI 0.936 to 1.08; p=1.00), or for correct diagnosis: PoCUS 90/127 (70%) vs. control 86/122 (70%); RR 0.987 (95% CI 0.671 to 1.45; p=1.00). Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We found that the use of PoCUS did change physicians' perceived shock category; however, PoCUS did not improve diagnostic accuracy for category of shock or diagnosis.
Civilian suicide rates vary by occupation in ways related to occupational stress exposure. Comparable military research finds suicide rates elevated in combat arms occupations. However, no research has evaluated variation in this pattern by deployment history, the indicator of occupational stress widely considered responsible for the recent rise in the military suicide rate.
Method
The joint associations of Army occupation and deployment history in predicting suicides were analysed in an administrative dataset for the 729 337 male enlisted Regular Army soldiers in the US Army between 2004 and 2009.
Results
There were 496 suicides over the study period (22.4/100 000 person-years). Only two occupational categories, both in combat arms, had significantly elevated suicide rates: infantrymen (37.2/100 000 person-years) and combat engineers (38.2/100 000 person-years). However, the suicide rates in these two categories were significantly lower when currently deployed (30.6/100 000 person-years) than when never deployed or previously deployed (41.2–39.1/100 000 person-years), whereas the suicide rate of other soldiers was significantly higher when currently deployed and previously deployed (20.2–22.4/100 000 person-years) than when never deployed (14.5/100 000 person-years). As a result, the adjusted suicide rate of infantrymen and combat engineers was most elevated when never deployed [odds ratio (OR) 2.9, 95% confidence interval (CI) 2.1–4.1], less so when previously deployed (OR 1.6, 95% CI 1.1–2.1), and not at all when currently deployed (OR 1.2, 95% CI 0.8–1.8). Adjustment for a differential ‘healthy warrior effect’ cannot explain this variation in the relative suicide rates of never-deployed infantrymen and combat engineers by deployment status.
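For scale, the headline rate implies the total follow-up time below; a minimal sketch of the person-year arithmetic, using only the abstract's figures:

    # 496 suicides at 22.4 per 100,000 person-years implies the follow-up below.
    suicides, rate_per_100k = 496, 22.4
    person_years = suicides / rate_per_100k * 100_000
    print(f"Implied follow-up: {person_years:,.0f} person-years")  # ~2.2 million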
Conclusions
Efforts are needed to elucidate the causal mechanisms underlying this interaction to guide preventive interventions for soldiers at high suicide risk.
The Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS) has found that the proportional elevation in the US Army enlisted soldier suicide rate during deployment (compared with the never-deployed or previously deployed) is significantly higher among women than men, raising the possibility of gender differences in the adverse psychological effects of deployment.
Method
Person-month survival models based on a consolidated administrative database for active duty enlisted Regular Army soldiers in 2004–2009 (n = 975 057) were used to characterize the gender × deployment interaction predicting suicide. Four explanatory hypotheses were explored involving the proportion of females in each soldier's occupation, the proportion of same-gender soldiers in each soldier's unit, whether the soldier reported sexual assault victimization in the previous 12 months, and the soldier's pre-deployment history of treated mental/behavioral disorders.
Results
The suicide rate of currently deployed women (14.0/100 000 person-years) was 3.1–3.5 times the rates of other (i.e. never-deployed/previously deployed) women. The suicide rate of currently deployed men (22.6/100 000 person-years) was 0.9–1.2 times the rates of other men. The adjusted (for time trends, sociodemographics, and Army career variables) female:male ratio of the odds ratios comparing the suicide rates of currently deployed with other soldiers was 2.8 (95% confidence interval 1.1–6.8); it became 2.4 after excluding soldiers with Direct Combat Arms occupations, and remained elevated (in the range 1.9–2.8) after adjusting for the hypothesized explanatory variables.
Conclusions
These results are valuable in excluding otherwise plausible hypotheses for the elevated suicide rate of deployed women and point to the importance of expanding future research on the psychological challenges of deployment for women.
We describe the efficacy of enhanced infection control measures, including those recommended in the Centers for Disease Control and Prevention’s 2012 carbapenem-resistant Enterobacteriaceae (CRE) toolkit, to control concurrent outbreaks of carbapenemase-producing Enterobacteriaceae (CPE) and extensively drug-resistant Acinetobacter baumannii (XDR-AB).
Design
Before-after intervention study.
Setting
Fifteen-bed surgical trauma intensive care unit (ICU).
Methods
We investigated the impact of enhanced infection control measures in response to clusters of CPE and XDR-AB infections in an ICU from April 2009 to March 2010. Polymerase chain reaction was used to detect the presence of blaKPC and resistance plasmids in CRE. Pulsed-field gel electrophoresis was performed to assess XDR-AB clonality. Enhanced infection-control measures were implemented in response to ongoing transmission of CPE and a new outbreak of XDR-AB. Efficacy was evaluated by comparing the incidence rate (IR) of CPE and XDR-AB before and after the implementation of these measures.
Results
The IR of CPE for the 12 months before the implementation of enhanced measures was 7.77 cases per 1,000 patient-days, whereas the IR of XDR-AB for the 3 months before implementation was 6.79 cases per 1,000 patient-days. All examined CPE shared endemic blaKPC resistance plasmids, and 6 of the 7 XDR-AB isolates were clonal. Following institution of enhanced infection control measures, the CPE IR decreased to 1.22 cases per 1,000 patient-days (P = .001), and no more cases of XDR-AB were identified.
Conclusions
Use of infection control measures described in the Centers for Disease Control and Prevention’s 2012 CRE toolkit was associated with a reduction in the IR of CPE and an interruption in XDR-AB transmission.
We investigated whether straight-line distance from residential compounds to healthcare facilities influenced mortality, the incidence of pneumonia, and vaccine efficacy against pneumonia in rural Gambia. Clinical surveillance for pneumonia was conducted on 6938 children living in the catchment areas of the two largest healthcare facilities. Deaths were monitored by three-monthly home visits. Children living >5 km from the two largest healthcare facilities had a 2.78 [95% confidence interval (CI) 1.74–4.43] times higher risk of all-cause mortality compared to children living within 2 km of these facilities. The observed rate of clinical and radiological pneumonia was lower in children living >5 km from these facilities compared to those living within 2 km [rate ratios 0.65 (95% CI 0.57–0.73) and 0.74 (95% CI 0.55–0.98), respectively]. There was no association between distance and estimated pneumococcal vaccine efficacy. Geographical access to healthcare services is an important determinant of survival and pneumonia in children in rural Gambia.
The US Army suicide rate has increased sharply in recent years. Identifying significant predictors of Army suicides in Army and Department of Defense (DoD) administrative records might help focus prevention efforts and guide intervention content. Previous studies of administrative data, although documenting significant predictors, were based on limited samples and models. A career history perspective is used here to develop more textured models.
Method
The analysis was carried out as part of the Historical Administrative Data Study (HADS) of the Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS). De-identified data were combined across numerous Army and DoD administrative data systems for all Regular Army soldiers on active duty in 2004–2009. Multivariate associations of sociodemographics and Army career variables with suicide were examined in subgroups defined by time in service, rank and deployment history.
Results
Several novel results were found that could have intervention implications. The most notable of these were significantly elevated suicide rates (69.6–80.0 suicides per 100 000 person-years compared with 18.5 suicides per 100 000 person-years in the total Army) among enlisted soldiers deployed either during their first year of service or with less than expected (based on time in service) junior enlisted rank; a substantially greater rise in suicide among women than men during deployment; and a protective effect of marriage against suicide only during deployment.
Conclusions
A career history approach produces several actionable insights missed in less textured analyses of administrative data predictors. Expansion of analyses to a richer set of predictors might help refine understanding of intervention implications.
It has been postulated that aging is the consequence of an accelerated accumulation of somatic DNA mutations and that subsequent errors in the primary structure of proteins ultimately reach levels sufficient to affect organismal function. The technical limitations of detecting somatic changes, and the lack of insight into the minimum level of erroneous proteins required to cause an error catastrophe, have so far prevented any firm conclusions on these theories. In this study, we sequenced the whole genome of DNA from the whole blood of two pairs of monozygotic (MZ) twins, 40 and 100 years old, on two independent next-generation sequencing (NGS) platforms (Illumina and Complete Genomics). Potentially discordant single-base substitutions supported by both platforms were validated extensively by Sanger, Roche 454, and Ion Torrent sequencing. We demonstrate that the genomes of the two twin pairs are germ-line identical between co-twins, and that the genomes of the 100-year-old MZ twins are distinguished by eight confirmed somatic single-base substitutions, five of which are within introns. Putative somatic variation between the 40-year-old twins was not confirmed in the validation phase. We conclude from this systematic effort that somatic single nucleotide substitutions can be detected using two independent NGS platforms, and that a century of life did not result in a large number of detectable somatic mutations in blood. The low number of somatic variants observed using two NGS platforms might provide a framework for detecting disease-related somatic variants in phenotypically discordant MZ twins.
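The cross-platform concordance filter described above amounts to a set intersection over candidate calls. A minimal, hypothetical Python sketch (the variant representation and example calls are illustrative, not taken from the study):

    # Hypothetical: keep only candidate somatic single-base substitutions
    # supported by both platforms; only these proceed to Sanger/454/Ion Torrent
    # validation. Variants are modeled as (chromosome, position, ref, alt) tuples.
    illumina_calls = {("chr1", 1_234_567, "A", "G"), ("chr7", 7_654_321, "C", "T")}
    complete_genomics_calls = {("chr1", 1_234_567, "A", "G"), ("chr2", 42, "G", "A")}

    concordant = illumina_calls & complete_genomics_calls
    print(concordant)  # only the call seen by both platforms survives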
Identifications of new material collected by Cambridge expeditions and a new find from Hopen are discussed with reference to earlier work and to the stratigraphic setting. All available generic names relating to Svalbard material are listed systematically. A supplement to the alphabetical index of records (Buchan et al., 1965) is added.
From a sample of more than 100 remnants from major and minor hydrodynamic binary galaxy merger simulations (Cox 2004; Cox et al. 2005), we find that stellar remnants are mostly oblate while dark matter halos are mostly prolate or triaxial. Shapes are determined by iteratively diagonalizing a moment-of-inertia tensor. The preferred axes of the two shapes are almost always nearly perpendicular. This can be understood by considering the influence of angular momentum and dissipation during the merger. If binary major mergers of spiral galaxies are responsible for the formation of elliptical galaxies, or some subpopulation of elliptical galaxies, then the galaxies can be expected to be oblate and the dark matter halos prolate, with the two preferred axes perpendicular to each other.
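The shape measurement reduces to an eigen-decomposition of a second-moment tensor of the particle positions. A minimal single-pass Python sketch (the method described above iterates, re-selecting particles within the fitted ellipsoid until the axis ratios converge; the particle data here are synthetic, for illustration only):

    import numpy as np

    def axis_ratios(pos):
        """pos: (N, 3) particle positions relative to the remnant centre."""
        I = pos.T @ pos / len(pos)                  # second-moment (shape) tensor
        ev = np.sort(np.linalg.eigvalsh(I))[::-1]   # eigenvalues ~ a^2 >= b^2 >= c^2
        a, b, c = np.sqrt(ev)
        return b / a, c / a                         # c/a << b/a ~ 1 suggests oblate

    rng = np.random.default_rng(0)
    pos = rng.normal(size=(10_000, 3)) * [1.0, 0.9, 0.5]   # flattened test cloud
    q, s = axis_ratios(pos)
    print(f"b/a = {q:.2f}, c/a = {s:.2f}")          # near 0.9 and 0.5: oblate-like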
Elements of kinematical and dynamical modeling of elliptical galaxies are presented. In projection, NFW models resemble Sérsic models, but with a very narrow range of shapes (m = 3±1). The total density profile of ellipticals cannot be NFW-like because the predicted local M/L and aperture velocity dispersion within an effective radius (Re) are much lower than observed. Stars must then dominate ellipticals out to a few Re. Fitting an NFW model to the total density profile of Sérsic+NFW (stars+dark matter [DM]) ellipticals results in very high concentration parameters, as found by X-ray observers. Kinematical modeling of ellipticals assuming an isotropic NFW DM model underestimates M/L at the virial radius by a factor of 1.6 to 2.4, because dissipationless ΛCDM halos have slightly different density profiles and slightly radial velocity anisotropy. In N-body+gas simulations of ellipticals as merger remnants of spirals embedded in DM halos, the slope of the DM density profile is steeper when the initial spiral galaxies are gas-rich. The Hansen & Moore (2006) relation between anisotropy and the slope of the density profile breaks down for gas and DM, but the stars follow an analogous relation with slightly less radial anisotropies for a given density slope. Using kurtosis (h4) to infer anisotropy in ellipticals is dangerous, as h4 is also sensitive to small levels of rotation. The stationary Jeans equation provides accurate masses out to 8 Re. The discrepancy between the modeling of Romanowsky et al. (2003), indicating a dearth of DM in ellipticals, and the simulations analyzed by Dekel et al. (2005), which match the spectroscopic observations of ellipticals, is partly due to radial anisotropy and to observing oblate ellipticals face-on. However, one of the 15 solutions to the orbit modeling of Romanowsky et al. is found to have an amount and concentration of DM consistent with ΛCDM predictions.
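For reference, the NFW profile referred to throughout has the standard two-parameter form rho(r) = rho_s / [(r/r_s)(1 + r/r_s)^2], with concentration c = r_vir/r_s. A minimal Python sketch of the profile itself:

    import numpy as np

    def nfw_density(r, rho_s, r_s):
        """NFW profile: rho_s / ((r/r_s) * (1 + r/r_s)**2)."""
        x = r / r_s
        return rho_s / (x * (1.0 + x) ** 2)

    r = np.logspace(-2, 1, 4)   # radii in units of r_s (rho_s, r_s arbitrary here)
    print(nfw_density(r, rho_s=1.0, r_s=1.0))  # steep rise inside r_s, r^-3 falloff outside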
The kinematical properties of elliptical galaxies formed during the mergers of equal-mass, stars+gas+dark matter spiral galaxies are compared to the observed low velocity dispersions found for planetary nebulae on the outskirts of ellipticals, which have been interpreted as pointing to a lack of dark matter in ellipticals (which poses a problem for the standard model of galaxy formation). We find that the velocity dispersion profiles of the stars in the simulated ellipticals match well the observed ones. The low outer stellar velocity dispersions are mainly caused by the radial orbits of the outermost stars, which, for a given binding energy, must have low angular momentum to reach their large radial distances, usually driven out along tidal tails.