In 2023, Princeton University Press published Richard Langlois’s The Corporation and the Twentieth Century: The History of American Business Enterprise. It is a book of comparable mass to Alfred Chandler’s 1977 The Visible Hand and equally ambitious.1 The erudition is vast. (The bibliography alone runs 78 closely-printed pages. There are 122 pages of equally closely-printed footnotes to the 522-page main text whose own font is not large.) A production such as this seemed worth more than the usual traditional-form reviews, and in the September following its publication, the Penn Economic History Forum put on a symposium to discuss it. Interest was widespread: attendance in the room was agreeably substantial and came from far beyond the seminar’s usual catchment area, and there were requests for the Zoom link to the proceedings from around the world. (The expense was not vast and the ratio of impact to expense was almost certainly favorable relative to ordinary seminars. The economic history community might not suffer from putting on more such events when suitable occasions arise.)
Clinical trials often struggle to recruit enough participants, with only 10% of eligible patients enrolling. This is concerning for conditions like stroke, where timely decision-making is crucial. Frontline clinicians typically screen patients manually, but this approach can be overwhelming and lead to many eligible patients being overlooked.
Methods:
To address the problem of efficient and inclusive screening for trials, we developed a matching algorithm that uses imaging and clinical variables gathered as part of the AcT trial (NCT03889249) to automatically screen patients, matching these variables against each trial's inclusion and exclusion criteria using rule-based logic. We then used the algorithm to identify patients who could have been enrolled in six trials: EASI-TOC (NCT04261478), CATIS-ICAD (NCT04142125), CONVINCE (NCT02898610), TEMPO-2 (NCT02398656), ESCAPE-MEVO (NCT05151172), and ENDOLOW (NCT04167527). To evaluate the algorithm, we compared its findings to the number of enrollments achieved without a matching algorithm. The algorithm's performance was validated against ground truth from a manual review by two clinicians. Its ability to reduce screening time was assessed by comparison with the average time required by study clinicians.
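As a rough sketch of the rule-based logic described above (the variable names, thresholds, and criteria here are hypothetical and are not taken from the AcT trial or the six target trials), eligibility matching can be expressed as a set of predicate rules evaluated against each patient's record:

```python
# Rough illustration only: the variable names, thresholds, and criteria below
# are hypothetical and are NOT the actual AcT or partner-trial criteria.

def matches_trial(patient, inclusion_rules, exclusion_rules):
    """True if every inclusion rule passes and no exclusion rule fires."""
    return (all(rule(patient) for rule in inclusion_rules)
            and not any(rule(patient) for rule in exclusion_rules))

# Hypothetical criteria for an illustrative acute-stroke trial
inclusion_rules = [
    lambda p: p["age"] >= 18,
    lambda p: p["nihss"] >= 6,                   # baseline stroke severity score
    lambda p: p["onset_to_arrival_min"] <= 270,  # presumed treatment window
]
exclusion_rules = [
    lambda p: p["hemorrhage_on_imaging"],        # imaging variable
    lambda p: p["on_anticoagulation"],           # clinical variable
]

patient = {"age": 72, "nihss": 9, "onset_to_arrival_min": 180,
           "hemorrhage_on_imaging": False, "on_anticoagulation": False}

if matches_trial(patient, inclusion_rules, exclusion_rules):
    print("Flag as potentially eligible; refer for manual confirmation")
```

In practice each trial would carry its own rule set, and any patient passing the automated filter would still be confirmed by a clinician, consistent with the validation against manual review described above.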
Results:
The algorithm identified more potentially eligible study candidates than the number of participants actually enrolled. It also showed over 90% sensitivity and specificity for all trials and reduced screening time by more than 100-fold.
Conclusions:
Automated matching algorithms can help clinicians quickly identify eligible patients and reduce resources needed for enrolment. Additionally, the algorithm can be modified for use in other trials and diseases.
Ideally, mosquito control programs (MCPs) use surveillance to target control measures to potentially dangerous mosquito populations. In North Carolina (NC), where there is limited financial support for mosquito control, communities may suffer from mosquito-related issues post-hurricane due to lack of existing MCPs. Here, study objectives were to (1) investigate the emergency response of a subset of NC counties post-Hurricane Florence and (2) develop guidelines and policy recommendations to assist MCPs in post-hurricane mosquito control response.
Methods:
A survey was administered to a subset of eastern NC counties (an area previously impacted by hurricanes) with various levels of MCPs (from none to well-developed).
Results:
All respondents indicated that having Federal Emergency Management Agency (FEMA) training would be helpful in developing a post-hurricane emergency response plan for mosquito control. There was concern related to a lack of knowledge of emergency control methods (eg, aerial/ground, adulticiding/larviciding) post-hurricane. MCP structure (eg, infrastructure, resources, operational plans/policies) could facilitate response activities and help ensure necessary emergency financial support from agencies such as FEMA.
Conclusions:
Mosquito control post-hurricane protects public health. Public health and other agencies can serve as networking resources for MCPs. Policy recommendations include implementing routine FEMA assistance training workshops to improve understanding of the processes involved in assistance and reimbursement.
Three joint-space algorithms slow the Cartesian path motion when joint motion appears to be approaching a joint position, speed, or acceleration limit. All three algorithms use quadratic curve fitting to predict where the joint motion is heading, followed by a prediction of how much time would elapse until a limit is reached.
If a joint motion limit would be reached within the time needed to stop the Cartesian motion, these algorithms reduce the Cartesian speed using pulsed speed settings so that the robot or machine tool has enough time to come to a complete stop. The joint-space velocity and acceleration control algorithms set the Cartesian speed override to either full speed or a reduced speed, several times per second. This allows the joints to reach, but not exceed, their maximum velocity and acceleration limits while remaining within the physical joint limits.
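A minimal sketch of this idea for a single joint, assuming one position limit and a fixed stopping time (an illustration, not the authors' implementation), is to fit a quadratic to recent joint samples, estimate the time until the limit would be reached, and choose between full and reduced Cartesian speed each control cycle:

```python
import numpy as np

# Illustrative sketch only (not the authors' implementation): fit a quadratic to
# recent samples of one joint's position, estimate the time until a position
# limit would be reached, and pick full or reduced Cartesian speed each cycle.

def time_to_limit(t, q, q_limit, horizon=2.0):
    """Quadratic fit q(t) = a*t^2 + b*t + c, then time until q_limit (or inf)."""
    a, b, c = np.polyfit(t, q, 2)
    future = np.linspace(t[-1], t[-1] + horizon, 200)
    q_pred = a * future**2 + b * future + c
    hit = np.where(q_pred >= q_limit)[0]
    return (future[hit[0]] - t[-1]) if hit.size else np.inf

def speed_override(t, q, q_limit, t_stop, full=1.0, reduced=0.2):
    """Cartesian speed fraction for this control cycle (pulsed full/reduced)."""
    return reduced if time_to_limit(t, q, q_limit) < t_stop else full

# Example: a joint sampled at 10 Hz approaching an assumed 2.9 rad position limit
t = np.array([0.0, 0.1, 0.2, 0.3, 0.4])
q = np.array([2.50, 2.58, 2.67, 2.77, 2.88])
print(speed_override(t, q, q_limit=2.9, t_stop=0.5))  # -> 0.2 (slow down)
```

The same pattern extends to velocity and acceleration limits by fitting and extrapolating the corresponding derivative signals, with the override recomputed on every cycle as the prose above describes.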
Opioid use disorder is a major public health crisis, and evidence suggests ways of better serving patients who live with opioid use disorder in the emergency department (ED). A multi-disciplinary team developed a quality improvement project to implement this evidence.
Methods
The intervention was developed by an expert working group consisting of specialists and stakeholders. The group set goals of increasing prescribing of buprenorphine/naloxone and providing next-day walk-in referrals to opioid use disorder treatment clinics. From May to September 2018, three Alberta ED sites and three opioid use disorder treatment clinics worked together to trial the intervention. We used administrative data to track the number of ED visits at which patients were given buprenorphine/naloxone. Monthly ED prescribing rates before and after the intervention were analyzed and compared with those at eight nonintervention sites. To measure continuity of treatment, we tracked whether patients continued to fill opioid agonist treatment prescriptions at 30, 60, and 90 days after their index ED visit.
Results
The intervention sites increased their prescribing of buprenorphine/naloxone during the intervention period and prescribed more buprenorphine/naloxone than the controls. Thirty-five of 47 patients (74.4%) discharged from the ED with buprenorphine/naloxone continued to fill opioid agonist treatment prescriptions 30 days and 60 days after their index ED visit. Thirty-four patients (72.3%) filled prescriptions at 90 days.
Conclusions
Emergency clinicians can effectively initiate patients on buprenorphine/naloxone when supports for this standardized, evidence-based care are in place within their practice setting and timely follow-up in the community is available.
Motivated by the occurrence of a moderately nearby supernova near the beginning of the Pleistocene, possibly as part of a long-term series beginning in the Miocene, we investigated whether nitrate rainout resulting from the atmospheric ionization of an enhanced cosmic ray flux could have, through its fertilizer effect, initiated carbon dioxide drawdown. Such a drawdown could have reduced the greenhouse effect and induced the climate change that led to the Pleistocene glaciations. We estimate that the nitrogen flux enhancement onto the surface from an event at 50 pc would be of order 10%, probably too small for dramatic changes. We estimate deposition of iron (another potential fertilizer) and find it is also too small to be significant. There are also competing effects of opposite sign, including muon irradiation and a reduction in photosynthetic yield caused by the UV increase from stratospheric ozone depletion, leading to an ambiguous result. However, if the atmospheric ionization induces a large increase in the frequency of lightning, as argued elsewhere, the amount of nitrate synthesis should be much larger, dominate the other effects, and induce the climate change. More work needs to be done to clarify the effects on lightning frequency.
We sought to fulfill a major objective of the CAEP Academic Section by conducting an environmental scan of the academic emergency medicine programs across the 17 Canadian medical schools.
Methods
We developed an 84-question questionnaire, which was distributed to academic heads. The responses were validated by phone by the lead author to ensure that the questions were answered completely and consistently. Details of pediatric emergency medicine units were excluded from the scan.
Results
At eight of 17 universities, emergency medicine has full departmental status and at two it has no official academic status. Canadian academic emergency medicine is practiced at 46 major teaching hospitals and 13 specialized pediatric hospitals. Another 69 Canadian hospital EDs regularly take clinical clerks and emergency medicine residents. There are 31 full professors of emergency medicine in Canada. Teaching programs are strong with clerkships offered at 16/17 universities, CCFP(EM) programs at 17/17, and RCPSC residency programs at 14/17. Fourteen sites have at least one physician with a Master’s degree in education. There are 55 clinical researchers with salary support at 13 universities. Sixteen sites have published peer-reviewed papers in the past five years, ranging from four to 235 per site. Annual budgets range from $200,000 to $5,900,000.
Conclusion
This comprehensive review of academic activities in emergency medicine across Canada identifies areas of strength as well as opportunities for improvement. CAEP and the Academic Section hope we can ultimately improve ED patient care by sharing best academic practices and becoming better teachers, educators, and researchers.
Designing materials for performance in high-radiation fields can be accelerated through a carefully chosen combination of advanced multiscale modeling paired with appropriate experimental validation. The studies reported in this work, the combined efforts of six universities working together as the Consortium on Cladding and Structural Materials, use that approach to focus on improving the scientific basis for the response of ferritic–martensitic steels to irradiation. A combination of modern modeling techniques with controlled experimentation has specifically focused on improving the understanding of radiation-induced segregation, precipitate formation and growth under radiation, the stability of oxide nanoclusters, and the development of dislocation networks under radiation. Experimental studies use both model and commercial alloys, irradiated with both ion beams and neutrons. Transmission electron microscopy and atom probe are combined with both first-principles and rate theory approaches to advance the understanding of ferritic–martensitic steels.
Trypanosomatids represent the causative agents of major diseases in humans, livestock and plants, with inevitable suffering and economic hardship as a result. They are also evolutionarily highly divergent organisms, and the many unique aspects of trypanosome biology provide opportunities in terms of identification of drug targets, the challenge of exploiting these putative targets and, at the same time, significant scope for exploration of novel and divergent cell biology. We can estimate from genome sequences that the degree of divergence of trypanosomes from animals and fungi is extreme, with perhaps one third to one half of predicted trypanosome proteins having no known function based on homology or recognizable protein domains/architecture. Two highly important aspects of trypanosome biology are the flagellar pocket and the nuclear envelope, where in silico analysis clearly suggests great potential divergence in the proteome. The flagellar pocket is the sole site of endo- and exocytosis in trypanosomes and plays important roles in immune evasion via variant surface glycoprotein (VSG) trafficking and providing a location for sequestration of various invariant receptors. The trypanosome nuclear envelope has been largely unexplored but, by analogy with higher eukaryotes, roles in the regulation of chromatin and most significantly, in controlling VSG gene expression are expected. Here we discuss recent successful proteomics-based approaches towards characterization of the nuclear envelope and the endocytic apparatus, the identification of conserved and novel trypanosomatid-specific features, and the implications of these findings.
Deuterium has a special place in cosmology, nuclear astrophysics, and galactic chemical evolution because of its unique property that it is created only in big bang nucleosynthesis, while all other processes result in its net destruction. For this reason, among other things, deuterium abundance measurements in the interstellar medium (ISM) allow us to determine the fraction of interstellar gas that has been cycled through stars and to constrain different Galactic chemical evolution (GCE) models. However, recent indications that deuterium might be preferentially depleted onto dust grains complicate the interpretation of measured ISM deuterium abundances. Recent estimates by Linsky et al. (2006) have yielded a lower bound to the “true”, undepleted ISM deuterium abundance that is very close to the primordial abundance, indicating a small deuterium astration factor, contrary to the demands of many GCE models. To avoid any prejudice about deuterium dust depletion along the different lines of sight used to determine the “true” D abundance, we propose a statistical Bayesian method to determine the undepleted ISM D abundance in a model-independent manner. We find the best estimate for the gas-phase ISM deuterium abundance to be (D/H)_ISM ≥ (2.0 ± 0.1) × 10⁻⁵. Presented are the results of Prodanović et al. (2009).
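As a loose, purely illustrative sketch of how a Bayesian treatment of depletion might be set up (this is not the published Prodanović et al. analysis; the depletion model, priors, and all numbers below are invented), one can marginalize an unknown per-sight-line depletion fraction and form a grid posterior for the undepleted abundance:

```python
import numpy as np

# Loose illustration only, not the published analysis: each sight line i is
# modeled as observing x_i = A * (1 - f_i) + noise, where A is the undepleted
# ISM D/H and f_i is an unknown depletion fraction. Marginalizing f_i with a
# flat prior and using a flat prior on A gives a grid posterior for A.
# All numbers below are invented for demonstration.

def posterior_A(x, sigma, A_grid, n_f=200):
    f = np.linspace(0.0, 0.9, n_f)       # assumed support for depletion fraction
    post = np.ones_like(A_grid)
    for xi, si in zip(x, sigma):
        # per-sight-line Gaussian likelihood, averaged (marginalized) over f
        like = np.exp(-0.5 * ((xi - np.outer(A_grid, 1.0 - f)) / si) ** 2).mean(axis=1)
        post *= like
    dA = A_grid[1] - A_grid[0]
    return post / (post.sum() * dA)       # normalize to a probability density

x = np.array([0.8, 1.5, 2.0, 1.2, 2.1]) * 1e-5   # hypothetical D/H measurements
sigma = np.full_like(x, 0.15e-5)                 # hypothetical 1-sigma errors
A_grid = np.linspace(1.0e-5, 3.5e-5, 500)
post = posterior_A(x, sigma, A_grid)
dA = A_grid[1] - A_grid[0]
print("posterior mean of A:", (A_grid * post).sum() * dA)
```

Because depletion can only lower the observed gas-phase values in this toy model, the posterior naturally concentrates at or above the highest measured sight line, which is the qualitative behavior behind a lower bound of the kind quoted above.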
Mothers who scored zero on the Beck Depression Inventory (N = 25) were compared to “depressed” mothers (high scores on the Beck) (N = 39) and nondepressed mothers (N = 98) during face-to-face interactions with their 5-month-old infants. The interaction videotapes were rated on the Interaction Rating Scales and were coded second-by-second for attentive/affective behavior states. The zero Beck mothers and their infants received lower ratings and were in less positive behavior states (alone or together) than the high-scoring Beck “depressed” mother/infant dyads, and even more so than the nondepressed mother/infant dyads. The lower activity levels, lesser expressivity, and less frequent vocalizing were suggestive of “depressed” behavior in both the mothers and their infants. In addition, the infants of the zero Beck mothers had lower vagal tone and lower growth percentiles (weight, length, and head circumference) than the infants of nondepressed mothers, although they did not differ from the infants of depressed mothers on these measures. These data suggest that mothers who report no depressive symptoms may present as much, if not greater, risk for their infants than mothers who do report depressive symptoms on the Beck Depression Inventory.
This paper is one of the four interrelated action agenda papers resulting from the National Summit on Public Health Legal Preparedness (Summit) convened in June 2007 by the Centers for Disease Control and Prevention and multi-disciplinary partners. Each of the action agenda papers deals with one of the four core elements of legal preparedness: laws and legal authorities; competency in using those laws; coordination of law-based public health actions; and information. Options presented in this paper are for consideration by policymakers and practitioners — in all jurisdictions and all relevant sectors and disciplines — with responsibilities for all-hazards emergency preparedness.
One expert's framing of the mission of public health may help improve understanding of the range of hazards for which to be legally prepared. These hazards include urgent realities — such as chronic disease, injury, disabilities, conventional communicable diseases, and an aging and obese population — and urgent threats, such as pandemic influenza, natural disasters, and terrorism.
A new method of depositing epitaxial ZnO nanocolumns on sputter-coated ZnO substrates is described that utilizes supersaturated zincate species in sodium hydroxide solutions and requires no complexing agents. Uniform arrays of columns are grown reproducibly over entire substrates in 10 to 50 min. Columns are 50 to 2000 nm long and 50 to 100 nm wide. Strict substrate cleaning and/or preparations are not necessary with this method, in contrast to many other techniques. Films grow only on substrates pre-coated with ZnO, not on bare glass or ITO- or SnO2-coated glass. Factors affecting the column growth are elucidated and experimental observations are correlated with crystal growth theory.
By
Homa J. Lee, U.S. Geological Survey, Menlo Park, California,
Robert E. Kayen, U.S. Geological Survey, Menlo Park, California,
Brian D. Edwards, U.S. Geological Survey, Menlo Park, California,
Michael E. Field, U.S. Geological Survey, Menlo Park, California,
James V. Gardner, U.S. Geological Survey, Menlo Park, California,
William C. Schwab, U.S. Geological Survey, Woods Hole, Massachusetts,
David C. Twichell, U.S. Geological Survey, Woods Hole, Massachusetts
The use of sidescan sonar technology has greatly expanded in recent years. One impediment to interpreting sidescan sonar images, which are a representation of the amount of sound backscattered from the seafloor, is the incomplete understanding of the physical meaning of acoustic backscatter intensity variations. Ground-truth studies can help us to understand the causes of variations in backscatter. We need to measure physical and geometric properties of seafloor sediment and correlate them with variations in sidescan sonar acoustic backscatter. We present in this paper comparative ground-truth studies of two deep-sea fan depositional lobes. We show that sediment lithology influences sidescan sonar images, but that the relation between backscatter intensity and sediment grain size is not uniquely defined.
Some of the seafloor characteristics that are potential causes of variations in acoustic backscatter intensity are surface roughness, variations in sediment composition, grazing angle of insonification, and seafloor slope, including topographic variability (Urick 1983). The influence of each of these, and the subbottom depth range over which sediment compositional variations are important, will vary with the characteristics of the sidescan system, including frequency, pulse length, bandwidth, time-varying gains, and footprint size. The number of variables needs to be kept to a minimum in order to simplify a ground-truth study. The distal parts of deep-sea fans are good locations for such studies because they tend to have nearly horizontal seafloor surfaces. Thus, the effect of topographic variability (bottom slope) can be ignored.
By
Brian D. Edwards, U.S. Geological Survey, Menlo Park, California,
Michael E. Field, U.S. Geological Survey, Menlo Park, California,
Neil H. Kenyon, Institute of Oceanographic Sciences, Southampton, United Kingdom
Long-range sidescan sonographs from the GLORIA sidescan sonar system provide a new perspective on the morphology and sediment distribution of small active submarine fans in the Santa Monica and San Pedro Basins of the California Continental Borderland. These sonographs, combined with 3.5-kHz seismic-reflection profiles, depict elongate submarine fan systems characterized by intermediate acoustic backscatter in the middle fan region and low backscatter in the distal reaches where the lower fan feeds onto the high-backscatter central basin plain. The fans are fed by low-backscatter channels that originate in shallow water at the northwest corner of each basin. These channels subsequently branch downstream into a system of smaller channels and lineations that extend to the tips of the distal-most deposits. In these distal reaches, fan deposition occurs in low-relief (1 to 2 m), tapering, low-backscatter fingerlike distributaries that extend to the high-backscatter central basin. The low-backscatter fingers are lens-shaped in cross section. Topographic lows occurring between adjacent fingers apparently direct the transport of subsequent flows with resulting shifts of the depocenters over time.
Core samples show that turbidity currents have deposited coarse sediment beyond the mid- and lower-fan environments and onto the western part of the flat central floor of both basins. The cores, combined with bottom photographs and high-resolution seismic-reflection profiles, indicate that patterns observed in sedimented areas on the GLORIA mosaic are caused largely by scattering from volume inhomogeneities and subbottom interfaces of sediment layers within the upper few meters of the sediment column.