Early Miocene land mammals from eastern North America are exceedingly rare. Over the past several decades a small but significant vertebrate fauna has been recovered by paleontologists and citizen scientists from the Belgrade Formation at the Martin Marietta Belgrade Quarry in eastern North Carolina. This assemblage comprises 12 land mammal taxa, including beaver (Castoridae), a stem lagomorph, carnivorans (Mustelidae, Ailuridae), horses (Equidae), rhinoceros (Rhinocerotidae), tapir (Tapiridae), peccary (Tayassuidae), anthracothere (Anthracotheriidae), entelodont (Entelodontidae), and protoceratid (Protoceratidae). Taken together, the biochronology of this Maysville Local Fauna indicates a late Arikareean (Ar3/Ar4) to early Hemingfordian (He1) North American Land Mammal Age (NALMA). This interval, which includes the Runningwater Chronofauna, documents numerous important Holarctic immigrants, including Amphictis, Craterogale, and cf. Menoceras found at this locality. Strontium isotope stratigraphy (SIS) of shark teeth collected in situ from the Belgrade Formation yields an age of 21.4 ± 0.13 Ma, which validates the age of interbedded land mammals within this unit. It is also consistent with the late Arikareean (Ar3/Ar4) biochronology and the Aquitanian Neogene marine stage. New SIS analyses of oysters (Striostrea gigantissima) and clams (Chione) from this mine, previously assigned to the late Oligocene or Late Miocene, are significantly older (28.0 ± 0.22 Ma and 27.6 ± 0.26 Ma, respectively) than the land mammals. Depending upon stratigraphic interpretations, these ages may confirm an older marine facies within the Belgrade Formation. This locality is important because of its marine and terrestrial tie-ins, which facilitate intercalibration of both NALMAs and Cenozoic marine stages.
Protein fermentation in the human gut is often associated with adverse health effects. Hence, understanding the fermentation characteristics of undigested dietary proteins is important for a comprehensive assessment of the nutritional value of foods. This study investigated the protein fermentation kinetics of diet-derived proteins from 31 different foods using an in vitro model and human faecal inoculum. The undigested diet-derived protein substrate originated from porcine ileal digesta obtained during assessment of the digestible indispensable amino acid score (DIAAS) of the foods. Significant variations in fermentation kinetic parameters, particularly in maximum gas production rate (Rmax) and time to reach cumulative gas production from the substrate (TGPs), were observed. Rmax ranged from 15.5 ± 0.7 mL/h for wheat bran-derived to 24.5 ± 0.9 mL/h for oatmeal-derived proteins. Egg-derived proteins had the shortest TGPs (14.7 ± 0.7 h), while mushroom-derived proteins had the longest (27.6 ± 7.1 h). When foods were categorised into five groups (‘animal protein’, ‘grains’, ‘legumes’, ‘fungi, algae, and microorganisms’, and ‘others’), no significant differences were found in fermentation kinetics parameters. Samples were additionally incubated with porcine inoculum to assess potential donor-species effects. Human inoculum showed significantly lower Rmax, cumulative gas production, and microbiota turnover than porcine inoculum, indicating reduced fermentative activity. Linear regression analysis revealed correlations between human- and porcine-derived inocula only for Rmax (R² = 0.78, P < 0.01) and TGPs (R² = 0.17, P < 0.05). These findings underscore the importance of using human inoculum in in vitro studies to better predict the health implications of foods alongside their DIAAS values.
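The correspondence between human- and porcine-derived inocula reported above can be examined with a simple linear regression. A minimal sketch using SciPy on hypothetical paired Rmax values (the numbers below are illustrative, not the study's data):

```python
import numpy as np
from scipy import stats

# Hypothetical paired Rmax values (mL/h) for the same substrates,
# measured once with human and once with porcine inoculum
human = np.array([15.5, 17.2, 18.9, 20.1, 21.4, 22.8, 24.5])
porcine = np.array([22.0, 24.1, 25.5, 27.8, 29.0, 31.2, 33.6])

# Ordinary least-squares fit: porcine = slope * human + intercept
res = stats.linregress(human, porcine)
print(f"slope={res.slope:.2f}, R^2={res.rvalue**2:.2f}, p={res.pvalue:.4f}")
```

A high R² on such a fit is what justifies using one donor species to predict the fermentation rate seen with the other; the weak TGPs correlation in the study (R² = 0.17) shows this only held for Rmax.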
Understanding protein fermentation in the hindgut of pigs is essential because of its implications for health, and ileal digesta is commonly used to study this process in vitro. This study aimed to assess the feasibility of using in vitro digested residues as a replacement for ileal digesta in evaluating protein fermentation potential. In vitro residues from cottonseed meal, maize germ meal, peanut meal, rapeseed cake, rapeseed meal, soyabean meal and sunflower meal were analysed using a modified gas production (GP) technique and a curve-fitting model to determine their fermentation dynamics and compare them with ileal digesta. Significant variations were observed in GP parameters between in vitro digested residues, indicating differences in nitrogen utilisation by faecal microbiota. Soyabean meal and sunflower meal exhibited the highest maximum GP rates (Rmax), with values of 29·5 ± 0·6 and 28·0 ± 1·2 ml/h, respectively, while maize germ meal showed the slowest protein utilisation (17·3 ± 0·2 ml/h). A positive relationship was found between the Rmax of in vitro residues and that of ileal digesta (R2 = 0·85, P < 0·01). However, GP potential (GPs) showed a tendency towards a negative relationship (R2 = 0·39, P < 0·1), likely due to the narrow range of observed GPs values and the presence of varied endogenous proteins in ileal digesta. Our results demonstrate the potential of using in vitro digested residues as a substitute for ileal digesta in assessing the fermentation potential of protein ingredients, particularly regarding the rate of protein fermentation.
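The curve-fitting step that yields Rmax and GPs from cumulative gas readings is typically done with a sigmoidal model. The abstract does not name the model used, so the sketch below assumes a modified Gompertz function purely as an illustration, and recovers the parameters from simulated 48 h readings:

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, A, Rmax, lag):
    """Modified Gompertz curve: A = asymptotic gas production (ml),
    Rmax = maximum gas production rate (ml/h), lag = lag time (h)."""
    return A * np.exp(-np.exp(Rmax * np.e / A * (lag - t) + 1.0))

# Simulated cumulative gas readings over a 48 h incubation
t = np.linspace(0.0, 48.0, 25)
rng = np.random.default_rng(1)
y = gompertz(t, 120.0, 28.0, 4.0) + rng.normal(0.0, 1.0, t.size)

# Fit the model; p0 gives rough starting guesses for the optimiser
popt, _ = curve_fit(gompertz, t, y, p0=[100.0, 20.0, 2.0])
A, Rmax, lag = popt
print(f"A={A:.1f} ml, Rmax={Rmax:.1f} ml/h, lag={lag:.1f} h")
```

The fitted Rmax is directly comparable to the per-substrate rates reported above (e.g., 29·5 ml/h for soyabean meal), while A corresponds to the GP potential.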
Recent findings show that it is possible in some cases to robustly and durably change implicit impressions of novel individuals. This work presents a challenge to long-standing theoretical assumptions about implicit impressions, and raises new research directions for changing and reducing implicit bias toward outgroups. Namely, implicit impressions of newly encountered individuals and groups are more amenable to robust change and updating than previously assumed, and some of the lessons from this work point to when and how we might try to change implicit bias toward well-known and familiar stigmatized groups and individuals.
The Glasgow Coma Scale (GCS) was devised in 1974 as a way of tracking the progress of neurosurgical coma patients. It comprises three components: eye opening, verbal response, and motor response. Since then, it has become the primary tool in Emergency Medical Services (EMS) and emergency departments for assessing cognitive function and triaging patients in the setting of acute trauma. However, the GCS was never intended to be used in this way, and a high degree of inter-rater variability has been demonstrated when assigning GCS scores to trauma patients. Potential differences in GCS score assignments between different countries were examined. It was hypothesized that there would be differences in mean total and component scores.
Methods:
Using de-identified data from the Pan-Asian Trauma Outcomes Study (PATOS), the distributions of GCS scores from six countries were assessed: Japan, Korea, Malaysia, Taiwan, Thailand, and Vietnam. Using SPSS data analysis, a one-way ANOVA and Bonferroni post-hoc tests were performed to compare the means of the three GCS components and the total GCS scores reported by EMS personnel caring for trauma patients.
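The analysis pipeline described above, a one-way ANOVA followed by Bonferroni-adjusted pairwise comparisons, can be sketched with SciPy. Country means, spreads, and sample sizes below are invented for illustration and are not PATOS data:

```python
import numpy as np
from itertools import combinations
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical total-GCS samples for three countries (GCS range is 3-15)
scores = {
    "Japan":  rng.normal(14.0, 1.5, 200).clip(3, 15),
    "Korea":  rng.normal(12.5, 2.5, 200).clip(3, 15),
    "Taiwan": rng.normal(13.5, 2.0, 200).clip(3, 15),
}

# One-way ANOVA across all countries
f_stat, p_anova = stats.f_oneway(*scores.values())
print(f"ANOVA: F={f_stat:.2f}, p={p_anova:.4g}")

# Bonferroni correction: divide alpha by the number of pairwise tests
pairs = list(combinations(scores, 2))
alpha_corrected = 0.05 / len(pairs)
for a, b in pairs:
    t, p = stats.ttest_ind(scores[a], scores[b])
    flag = "significant" if p < alpha_corrected else "n.s."
    print(f"{a} vs {b}: t={t:.2f}, p={p:.4g} ({flag})")
```

Dedicated post-hoc routines (e.g., `scipy.stats.tukey_hsd`) exist, but the explicit alpha division above makes the Bonferroni logic visible.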
Results:
Data from 15,173 cases showed significant differences in mean total GCS score between countries (P <.001) as well as in mean component GCS scores (P <.001 for each of eye, verbal, and motor). Post-hoc tests showed that EMS personnel in Korea assigned significantly lower scores compared to all other countries in both component and total GCS scores. Field personnel in Japan, Malaysia, and Vietnam assigned the highest scores and significantly differed from the other three countries on component and total scores; Thailand and Taiwan had similar scores but significantly differed from the other four countries on component and total scores. Visual inspection of mean component and total GCS score histograms revealed differences in score assignment patterns among countries.
Conclusions:
There are a number of significant differences in the mean total and component GCS scores assigned by EMS personnel in the six Asian countries studied. More investigation is necessary to determine if there is clinical significance to these differences in GCS score assignments, as well as the reasons for the differences.
This edited volume set out to explore how resilience, adaptive peacebuilding and transitional justice can help societies recover after collective violence. To do so, it examined diverse societies across Africa, Asia, Europe, Latin America and the Middle East that have experienced, or are continuing to experience, violence. The eight case studies – Bosnia-Herzegovina (BiH), Rwanda, Uganda, Bangladesh, Cambodia, Colombia, Guatemala and Palestine – provide in-depth conceptual and empirical analyses of resilience and adaptive peacebuilding in a range of transitional justice settings. This final chapter will reflect on what we have learned from the cases covered in this volume. In particular, it will discuss how they enrich our understanding of the concepts of resilience, adaptive peacebuilding and transitional justice, and what they tell us about the complex ways that resilience and adaptive peacebuilding manifest in transitional and post-conflict settings. The chapter begins with a discussion of adaptive peacebuilding and resilience in transitional justice contexts.
OBJECTIVES/GOALS: The goal of this study is to better understand homicide victims who were institutionalized within 30 days prior to their death. Improved knowledge of this population can potentially help prevent future homicides. METHODS/STUDY POPULATION: A retrospective analysis of the 36 states included in the 2003-2017 National Violent Death Reporting System was performed. Demographics of recently institutionalized homicide victims (RIHV), i.e., those institutionalized within the 30 days before death, were compared to those of homicide victims who were not recently institutionalized. Circumstances of the homicide, such as suspected gang involvement, were also compared. Parametric and non-parametric statistical analyses were performed. Significance was set at p<0.05. RESULTS/ANTICIPATED RESULTS: There were 81,229 homicides, of which 992 (1.2%) were RIHV. The plurality of RIHV were Black (49.6%), and RIHV were older than victims who were not recently institutionalized (37.2 vs. 34.8 years, p<0.001). RIHV had a high school degree or higher in 54.8% of cases, and the primary homicide weapon was a firearm in 67% of deaths. They were more likely to be homeless (3.1% vs. 1.5%), have a mental health diagnosis (9.2% vs. 2.3%), abuse alcohol (6.1% vs. 2.2%), or abuse other substances (15.2% vs. 5.8%) [all p<0.001]. These victims were most commonly institutionalized in a correctional facility or a hospital, as opposed to other facilities such as nursing homes. Homicide circumstances for RIHV were more likely to involve abuse/neglect (4.3% vs. 2.2%, p<0.001), gang violence (7.6% vs. 5.6%, p=0.002), or a hate crime (1.0% vs. 0.1%, p<0.001). DISCUSSION/SIGNIFICANCE OF IMPACT: Contact with an institution such as a hospital or prison gives high-risk patients the opportunity to participate in violence intervention programs. These institutions should seek to identify and intervene with this population to reduce the risk of homicide.
Roderick Chisholm has offered a new attempt to define knowledge in the second edition of Theory of Knowledge. The purpose of this paper is to present an objection to that definition.
Here is the proposed definition (numbering below follows the text):
D6.4 h is known by S =df h is accepted by S; h is true; and h is nondefectively evident for S.
To understand D6.4 we need to know what it is for a proposition to be nondefectively evident for a person. That has the following definition:
D6.3 h is nondefectively evident for S =df Either h is certain for S, or h is evident for S and is entailed by a conjunction of propositions each having for S a basis which is not a basis for any false proposition.
There are four technical expressions in D6.3: “entailed,” “certain,” “evident,” and “basis.”
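For readers who prefer a symbolic rendering, D6.4 and D6.3 can be sketched in quantified form as follows. This is an informal gloss, not Chisholm's own notation: the predicate labels $K$, $Acc$, $Cert$, $Ev$, $NDEv$ are ours, $B(b,p,S)$ abbreviates "b is a basis of p for S," and $\Rightarrow$ stands for entailment:

```latex
% Informal rendering of Chisholm's D6.4 and D6.3 (predicate names ours)
\begin{align*}
\text{(D6.4)}\quad & K(h,S) \;\leftrightarrow\; Acc(h,S) \,\wedge\, h \,\wedge\, NDEv(h,S)\\
\text{(D6.3)}\quad & NDEv(h,S) \;\leftrightarrow\; Cert(h,S) \,\vee\,
  \Bigl[\, Ev(h,S) \,\wedge\, \exists p_{1},\dots,p_{n}\,
  \bigl((p_{1}\wedge\dots\wedge p_{n}) \Rightarrow h\bigr)\\
& \qquad \wedge\; \textstyle\bigwedge_{i}\exists b\,
  \bigl(B(b,p_{i},S) \,\wedge\, \neg\exists q\,(\neg q \,\wedge\, B(b,q,S))\bigr)\Bigr]
\end{align*}
```

The final conjunct captures the key clause: every proposition in the entailing conjunction must have, for S, some basis that is not a basis for any false proposition.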
Analyzing the recurring pattern of historical lead poisoning that has shaped our present-day legislative systems regarding lead consumption, this work focuses on creating awareness of, and caution toward, lead halide perovskite commercialization while concurrently pointing out open considerations and ambiguities in policies and regulations.
Lead halide perovskites caused a paradigm shift in state-of-the-art photovoltaic technology half a decade ago and have gained tremendous momentum ever since. Given their seemingly imminent commercialization, rigorous scrutiny of their potential environmental impact is becoming increasingly relevant. In light of the current need for sustainable energy resources, several start-up and spin-off companies have been established, initially promising modules on the market by the end of 2017. On the downside, the fact that lead represents approximately one third of the absorber layer by weight in such photovoltaic devices is reason enough to be wary of the potential environmental impact of their large-scale implementation. While many have wondered where the acceptable boundaries of lead consumption lie, the question remains a focal point in many discussions, as it seems almost unattainable to ban lead usage from our society. With lead currently listed among the ten chemicals of major public health concern by the World Health Organization, the misgivings grow further as recent studies also demonstrate promising applications of lead halide perovskites in light-emitting diodes, lasers, batteries, and photodetectors. Hence, there is no doubt that a discussion should commence on how to assess and handle the impact of lead in a new technology of such high potential.
By reflecting on the historical experience of anthropogenic lead poisoning, which still shapes our legislative systems at present, this work investigates and carefully scrutinizes the current legislation governing the use of lead halide perovskites in optoelectronic applications. Analyzing the recurring pattern of historical lead consumption, the focus extends to creating awareness of, and caution toward, lead halide perovskite commercialization while concurrently pointing out open considerations and ambiguities in policies and regulations. Ultimately, this work aims to initiate a discussion on “if” and “how” this burgeoning class of materials can enter the consumer market.
We assume that firms are more risk averse than households and that they manage their risk through a financial sector, which consists of learning and hedging. Firms that learn (by observing demand shocks) face less uncertainty and produce more than firms that hedge (by selling future production at a fixed price). If a policy or parameter change stabilizes the economy, then there is less learning and usually less production. Welfare, however, is usually maximized when the financial sector, which requires inputs but does not directly provide utility or affect production, is smallest. Monetary policy can improve welfare by either taxing learning or subsidizing hedging. If firms are risk averse over nominal profits instead of real profits, then interest rate policy can also improve welfare by stabilizing prices and thus minimizing the size of the financial sector.
The present study aimed to evaluate the inter-individual variability in fermentation of standard fibrous substrates by faecal inocula from ten healthy adult female cats. Substrates were citrus pectin (CP), fructo-oligosaccharides (FOS), guar gum (GG), sugar beet pulp (SBP) and wheat middlings (WM). Each substrate was incubated with faecal inoculum from each cat. Gas production was measured continuously during the 48 h incubation, and SCFA and organic matter disappearance (only SBP and WM) were determined after incubation. Of the ten cats, nine produced faeces on the days of inoculum preparation. The substrates contrasted in terms of the fermentation parameters measured. Inter-individual variability was in general lower for the simpler, purer substrates (CP, FOS, GG) than for the more complex substrates containing mixtures of fibres (SBP, WM). Furthermore, inter-individual variability was lower for total SCFA and gas produced than for the proportions of butyrate and branched-chain fatty acids and for the parameters of gas production kinetics. It is concluded that the variability in in vitro fermentation parameters is associated with the complexity of fibrous substrates. The presented data are instrumental for calculating the number of faecal donors required for precise in vitro characterisation of the fermentability of dietary fibres. In addition, the number of faecal donors should be adjusted to the specific fermentation parameter(s) of interest.
To gain knowledge on the precision of an in vitro method for characterising the fermentability of dietary fibres, this study aimed to evaluate the repeatability and reproducibility of such a method. Substrates used were citrus pectin (CP), fructo-oligosaccharides (FOS), guar gum (GG), sugar beet pulp (SBP) and wheat middlings (WM). Each substrate was incubated with faecal inoculum from five cats, with three replicates for each substrate–cat combination. Gas production was measured continuously during the 48 h incubation, and SCFA and organic matter disappearance (only SBP and WM) were determined after incubation. Four consecutive runs were performed. The within-run variability (repeatability) was generally lower for the simpler, purer substrates (CP, FOS, GG) than for the more complex substrates containing mixtures of fibres (SBP, WM). Replicates showed high variability, in particular for SCFA profiles and parameters of gas production kinetics. The between-run CVs (reproducibility) for the measured parameters were, in general, below 10 % for CP, FOS and GG, with higher values obtained for SBP and WM. It is concluded that for precise dietary fibre characterisation, replicates should be multiple and adjusted according to the variability of the parameters of interest and the complexity of the fibres. The method yielded reproducible results with some variation in absolute values, which may have an impact on the significance level of differences among substrates.
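The within-run (repeatability) and between-run (reproducibility) coefficients of variation discussed above follow directly from replicate measurements. A minimal NumPy sketch on hypothetical total gas production values, laid out as 4 runs × 3 replicates for one substrate:

```python
import numpy as np

# Hypothetical total gas production (ml) for one substrate:
# rows = 4 consecutive runs, columns = 3 replicates per run
runs = np.array([
    [118.0, 121.5, 119.2],
    [112.4, 115.0, 113.8],
    [120.1, 117.6, 122.3],
    [114.9, 116.2, 113.1],
])

# Within-run CV (repeatability): CV among replicates in each run, averaged
within_cv = np.mean(runs.std(axis=1, ddof=1) / runs.mean(axis=1)) * 100

# Between-run CV (reproducibility): CV of the per-run means
run_means = runs.mean(axis=1)
between_cv = run_means.std(ddof=1) / run_means.mean() * 100

print(f"within-run CV  = {within_cv:.1f} %")
print(f"between-run CV = {between_cv:.1f} %")
```

Under the study's criterion, a between-run CV below 10 % (as for CP, FOS and GG) would count as reproducible, while the more complex substrates exceeded it.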
While the overall survival rate for out-of-hospital cardiac arrest (OHCA) is low, ranging from 5%-10%, several characteristics have been shown to decrease mortality, such as presence of bystander cardiopulmonary resuscitation (CPR), witnessed vs unwitnessed events, and favorable initial rhythm (VF/VT). More recently, studies have shown that modified CPR algorithms, such as chest-compression only or cardio-cerebral resuscitation, can further increase survival rates in OHCA. Most of these studies have included only OHCA patients with “presumed cardiac etiology,” on the assumption that airway management is of lesser impact than chest compressions in these patients. However, prehospital personnel often lack objective and consistent criteria to assess whether an OHCA is of cardiac or non-cardiac etiology.
Hypothesis/Problem
The relative proportions of cardiac vs non-cardiac etiology in published data sets of OHCA in the peer-reviewed literature were examined in order to assess the variability of prehospital clinical etiology assessment.
Methods
A Medline (US National Library of Medicine, National Institutes of Health; Bethesda, Maryland USA) search was performed using the subject headings “OHCA” and “Emergency Medical Services” (EMS). Studies were included if they reported prevalence of cardiac etiology among OHCA in the entire patient sample, or in all arms of a comparison study. Studies that either did not report etiology of OHCA, or that excluded all cardiac or non-cardiac etiologies prior to reporting clinical data, were excluded.
Results
Twenty-four studies were identified, containing 27 datasets of OHCA which reported the prevalence of presumed cardiac vs non-cardiac etiology. These 27 datasets were drawn from 15 different countries. The prevalence of cardiac etiology among OHCA ranged from 50% to 91%. No obvious patterns were found regarding database size, year of publication, or global region (continent) of origin.
Conclusions:
There exists significant variation in published rates of cardiac etiology among OHCAs. While some of this variation likely reflects different actual rates of cardiac etiologies in the sampled populations, varying definitions of cardiac etiology among prehospital personnel or varying implementation of existing definitions may also play a role. Different proportions of cardiac vs non-cardiac etiology of OHCA in a sample could result in entirely different interpretations of data. A more specific consensus definition of cardiac etiology than that which currently exists in the Utstein template may provide better guidance to prehospital personnel and EMS researchers in the future.
Carter RM, Cone DC. When is a Cardiac Arrest Non-Cardiac? Prehosp Disaster Med. 2017;32(5):523–527.
Experiments were conducted to determine the effect of lime pre-treatment on the chemical composition and in vitro rumen degradability of date palm leaves (DPL). Lime pre-treatments, with or without oxygen supply, were applied for 1, 2 and 3 weeks at 25 and 40 °C. Lime was neutralized by the Calcium-Capturing-by-Carbonation process. Delignification and in vitro rumen gas production were significantly influenced by duration, temperature and oxygen. At 40 °C, oxygen presence stimulated more delignification and subsequently increased in vitro rumen degradability. Lime pre-treatment with 0·2 g calcium hydroxide (Ca(OH)2)/g dry biomass for 3 weeks at 40 °C in the presence of oxygen resulted in a 3-fold increase in gas production after 24 h of incubation, compared with untreated biomass. Lime treatment of DPL with aeration resulted in higher lignin removal and subsequent rumen degradability than without aeration. A techno-economic analysis is needed to select the most efficient and economically feasible pre-treatment procedure.
It is unclear which pediatric disaster triage (PDT) strategy yields the best accuracy or best patient outcomes.
Methods
We conducted a cross-sectional analysis on a sample of emergency medical services providers from a prospective cohort study comparing the accuracy and triage outcomes for 2 PDT strategies (Smart and JumpSTART) and clinical decision-making (CDM) with no algorithm. Participants were divided into cohorts by triage strategy. We presented 10-victim, multi-modal disaster simulations. A Delphi method determined patients’ expected triage levels. We compared triage accuracy overall and for each triage level (RED/Immediate, YELLOW/Delayed, GREEN/Ambulatory, BLACK/Deceased).
Results
There were 273 participants (71 JumpSTART, 122 Smart, and 81 CDM). There was no significant difference between Smart triage and CDM. When JumpSTART triage was used, there was greater accuracy than with either Smart (P<0.001; OR [odds ratio]: 2.03; interquartile range [IQR]: 1.30, 3.17) or CDM (P=0.02; OR: 1.76; IQR: 1.10, 2.82). JumpSTART outperformed Smart for RED patients (P=0.05; OR: 1.48; IQR: 1.01, 2.17), and outperformed both Smart (P<0.001; OR: 3.22; IQR: 1.78, 5.88) and CDM (P<0.001; OR: 2.86; IQR: 1.53, 5.26) for YELLOW patients. Furthermore, JumpSTART outperformed CDM for BLACK patients (P=0.01; OR: 5.55; IQR: 1.47, 20.0).
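The odds ratios above compare triage accuracy between strategies. A standard Wald odds-ratio construction with a 95% confidence interval (an illustration on invented counts, not the authors' exact computation) looks like this:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = correct triages with strategy 1, b = incorrect with strategy 1,
    c = correct triages with strategy 2, d = incorrect with strategy 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 60/100 accurate with one strategy vs 43/100 with another
or_, lo, hi = odds_ratio_ci(60, 40, 43, 57)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An interval that excludes 1.0, as in the JumpSTART-vs-Smart comparison above, indicates a statistically significant accuracy advantage.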
Conclusion
Our simulation-based comparison suggested that JumpSTART triage outperforms both Smart and CDM. JumpSTART outperformed Smart for RED patients and CDM for BLACK patients. For YELLOW patients, JumpSTART yielded more accurate triage results than did Smart triage or CDM. (Disaster Med Public Health Preparedness. 2016;10:253–260)
Recent findings in social psychology show how implicit affective responses can be changed, leading to strong, fast, and durable updating. This work demonstrates that new information viewed as diagnostic or which prompts reinterpretations of previous learning produces fast revision, suggesting two factors that might be leveraged in clinical settings. Reconsolidation provides a plausible route for making such reasoning possible.
A belief is debased when believing is given a basis that is not proper for knowledge, such as wishful thinking or superstition. The possibility of a debasing demon is the possibility of a maximally powerful agent who aims to prevent knowledge by debasing beliefs. Jonathan Schaffer contends that the debasing demon is a threat to all knowledge. Schaffer does not assess the strength of the skeptical challenge from debasing. It is argued here that debasing does not strengthen any case for skepticism. A debasing demon is possible. We should acknowledge that our beliefs could have been debased, and that this could have been done in an introspectively undetectable way. But acknowledging this leaves us in a position to know that our apparent knowledge is genuine. It does not enhance any reason to think that we lack knowledge.
Disasters are high-stakes, low-frequency events. Telemedicine may offer a useful adjunct for paramedics performing disaster triage. The objective of this study was to determine the feasibility of telemedicine in disaster triage, and to determine whether telemedicine has an effect on the accuracy of triage or the time needed to perform triage.
Methods
This is a feasibility study in which an intervention team of two paramedics used the mobile device Google Glass (Google Inc; Mountain View, California USA) to communicate with an off-site physician disaster expert. The paramedic team triaged simulated disaster victims at the triennial drill of a commercial airport. The simulated victims had preassigned expected triage levels. The physician had an audio-video interface with the paramedic team and was able to observe the victims remotely. A control team of two paramedics performed disaster triage in the usual fashion. Both teams used the SMART Triage System (TSG Associates LLP; Halifax, England), which assigns patients into Red, Yellow, Green, and Black triage categories. The paramedics were video recorded, and their time required to triage was logged. It was determined whether the intervention team and the control team varied regarding accuracy of triage. Finally, the amount of time the intervention team needed to triage patients when telemedicine was used was compared to when that team did not use telemedicine.
Results
The two teams triaged the same 20 patients. There was no significant difference between the two groups in overall triage accuracy (85.7% for the intervention group vs 75.9% for the control group; P = .39). Two patients were triaged with telemedicine. For the intervention group, there was a significant difference in time to triage patients with telemedicine versus those without telemedicine (35.5 seconds; 95% CI, 72.5-143.5 vs 18.5 seconds; 95% CI, 13.4-23.6; P = .041).
Conclusion
There was no increase in triage accuracy when paramedics evaluating disaster victims used telemedicine, and telemedicine required more time than conventional triage. There are a number of obstacles to available technology that, if overcome, might improve the utility of telemedicine in disaster response.
Cicero MX, Walsh B, Solad Y, Whitfill T, Paesano G, Kim K, Baum CR, Cone DC. Do You See What I See? Insights from Using Google Glass for Disaster Telemedicine Triage. Prehosp Disaster Med. 2015;30(1):1-5.