Mainstream economists give the misleading impression that their argument for austerity is purely technical and indeed the most ‘scientific’. The argument developed here is that their reasoning is no more independent of ideology, power and ethics than that of their heterodox critics. The widespread belief that austerity policies are scientifically justified has prevented arguments against austerity from gaining more traction; issues of ideology, power and ethics need to be brought to the fore as part of the arguments on both sides. In other words, awareness of the epistemological issues arising from an open-system ontology is critical to understanding the crisis and the policy response, and therefore to challenging that understanding and encouraging a radical policy shift. The critique of austerity policies would therefore be strengthened by a critique of the mainstream’s rhetorical (mis)representation of economic theorising.
Introduction: Although acute gastroenteritis is an extremely common childhood illness, there is a paucity of literature characterizing the associated pain and its management. Our primary objective was to quantify the pain experienced by children with acute gastroenteritis in the 24 hours prior to emergency department (ED) presentation. Secondary objectives included describing maximum pain, analgesic use, discharge recommendations, and factors that influenced analgesic use in the ED. Methods: Study participants were recruited into this prospective cohort study by the Alberta Provincial Pediatric EnTeric Infection TEam between January 2014 and September 2017. This study was conducted at two Canadian pediatric EDs: the Alberta Children's Hospital (Calgary) and the Stollery Children's Hospital (Edmonton). Eligibility criteria included age < 18 years, acute gastroenteritis (≥ 3 episodes of diarrhea or vomiting in the previous 24 hours), and symptom duration ≤ 7 days. The primary study outcome, caregiver-reported maximum pain in the 24 hours prior to presentation, was assessed using the 11-point Verbal Numerical Rating Scale. Results: We recruited 2136 patients, median age 20.8 months (IQR 10.4, 47.4); 45.8% (979/2136) female. In the 24 hours prior to enrolment, caregivers reported that their child experienced moderate (4-6) pain in 28.6% (610/2136) of cases and severe (7-10) pain in 46.2% (986/2136). During the emergency visit, 31.1% (664/2136) described pain as moderate and 26.7% (571/2136) as severe. In the ED, analgesia was provided to 21.2% (452/2131) of children. The most commonly administered analgesics in the ED were ibuprofen (68.1%, 308/452) and acetaminophen (43.4%, 196/452); at home, acetaminophen was most commonly administered (77.7%, 700/901), followed by ibuprofen (37.5%, 338/901). Factors associated with analgesia use in the ED were greater pain scores during the visit, having a primary-care physician, shorter illness duration, fewer diarrheal episodes, presence of fever, and hospitalization. Conclusion: Although children presenting to the ED with acute gastroenteritis experience moderate to severe pain, both prior to and during their emergency visit, analgesic use is limited. Future research should focus on appropriate pain management through the development of effective and safe pain treatment plans.
Following pioneering work in Norway, cirque glaciers have widely been viewed as rigidly rotating bodies. This model is incorrect for basin-filling cirque glaciers, as we have demonstrated at West Washmawapta Glacier, a small glacier in the Canadian Rocky Mountains. Here we report observations at the same glacier that assess whether complex temporal variations of flow also occur. For parts of three summers, we measured daily displacements of the glacier surface. In one year, four short-duration speed-up events were recorded. Three of the events occurred during the intervals of warmest weather, when melt was most rapid; the fourth event occurred immediately following heavy rain. We interpret the speed-up events as manifestations of enhanced water inputs to the glacier bed and associated slip lubrication by increased water volumes and pressures. No further speed-ups occurred in the final month of the melt season, despite warm temperatures and several rainstorms; the dominant subglacial water system likely transformed from one of poorly connected cavities to one with an efficient channel network. The seasonal evolution of hydrology and flow resembles behaviors documented at other, larger temperate glaciers and indicates that analyses of cirque erosion cannot rely on simple assumptions about ice dynamics.
The specification of central banking functions and the institutional arrangements within which these functions are performed are open for discussion. The need for changes in central bank goals and operations in the face of the financial crisis has opened up issues which, during the Great Moderation period, had been regarded as long settled.
One of these issues concerns the goals of central banks. While inflation targeting had been applied for some time by many central banks, the new economic environment seems to many to warrant an alternative; candidates for an alternative include nominal gross domestic product (GDP) targeting, unemployment targeting, and promoting financial stability. But here we consider the merits of a range of goals pursued simultaneously, with potential conflicts addressed by means of judgment. Further, while we will emphasize the importance of the role of central banks, it will be recognized that they cannot exercise direct control, either of macroeconomic aggregates or of the financial sector. In attempting to achieve their goal(s), central banks have always had to tailor their policy instruments to the changing character and strength of financial markets. The liberalization and globalization of finance since the 1970s, and then the financial crisis, dramatically altered both the scale and sophistication of financial markets and the relationship between central banks and financial institutions. But, while such developments challenge the effectiveness of traditional instruments of monetary policy, care must be taken not to exaggerate the power previously exercised by central banks. Even before the 1970s, central banks could only influence the level of credit and money in the economy, not control it.
A further issue concerns the institutional arrangements within which central banks pursue their goals and in particular the case for central bank independence. The return of central banks to major open-market operations in sovereign debt has eroded previous efforts to separate monetary policy and fiscal policy. At the same time, central bank activities have had their own direct political consequences, including substantial redistribution of income. There are good reasons therefore to revisit the presumption that central bank independence is beneficial.
L-tryptophan 50 mg/kg was administered orally to patients suffering from either unipolar or bipolar affective illness, and the concentration of 5-hydroxyindoleacetic acid (5-HIAA) in their cerebrospinal fluid was estimated eight hours later. There was no significant difference between the patient groups, or between these and patients with neurological disease. These findings suggest reduced neuronal activity in the 5-hydroxytryptaminergic system in some depressed patients rather than an absolute deficiency of tryptophan-5-hydroxylase. The synthesis of 5-HIAA in response to tryptophan varied with age.
Experimental infections of lambs with Fasciola hepatica are described. The growth rate of the parasite, time of entry to the bile ducts, and time of patency are recorded and a preferential migration of the parasite in the liver parenchyma noted. The gross and histological lesions produced in the liver from 1 to 40 weeks after infection are described and compared with previous observations in cattle.
The parenchymal migration of the parasite is shown to consist of two phases: a free migrating phase up to the 6th week, and a localized phase after the 6th week prior to entry into the bile ducts. Hepatic cell regeneration is observed and hepatic fibrosis is minimal. The localized phase of migration is associated with a unique peripheral palisade of giant cells in the fluke tracts and with the formation of pseudofollicular aggregations of lymphocytes. The presence of flukes in the bile ducts produces fibrosis of the duct walls. The walls, however, remain pliable and expanded to accommodate the parasites, and calcification was never observed.
Three years following the global outbreak of severe acute respiratory syndrome (SARS), a national, Web-based survey of Canadian nurses was conducted to assess perceptions of preparedness for disasters and access to support mechanisms, particularly for nurses in emergency and critical care units.
Hypotheses:
The following hypotheses were tested: (1) nurses' sense of preparedness for infectious disease outbreaks and naturally occurring disasters will be higher than for chemical, biological, radiological, and nuclear (CBRN)-type disasters associated with terrorist attacks; (2) perceptions of preparedness will vary according to previous outbreak experience; and (3) perceptions of personal preparedness will be related to perceived institutional preparedness.
Methods:
Nurses from emergency departments and intensive care units across Canada were recruited via flyer mailouts and e-mail notices to complete a 30-minute online survey.
Results:
A total of 1,543 nurses completed the survey (90% female; 10% male). The results indicate that nurses feel unprepared to respond to large-scale disasters/attacks. The sense of preparedness varied according to the outbreak/disaster scenario with nurses feeling least prepared to respond to a CBRN event. A variety of socio-demographic factors, notably gender, previous outbreak experience (particularly with SARS), full-time vs. part-time job status, and region of employment also were related to perceptions of risk. Approximately 40% of respondents were unaware if their hospital had an emergency plan for a large-scale outbreak. Nurses reported inadequate access to resources to support disaster response capacity and expressed a low degree of confidence in the preparedness of Canadian healthcare institutions for future outbreaks.
Conclusions:
Canadian nurses have indicated that considerably more training and information are needed to enhance preparedness for frontline healthcare workers as important members of the response community.
By
Brian C. Dow, Consultant Clinical Microbiologist; Head, Scottish National Blood Transfusion Service, National Microbiology Reference Unit, West of Scotland Transfusion Centre, Glasgow, UK,
Eberhard W. Fiebig, Associate Professor/Vice Chair, UCSF Department of Laboratory Medicine; Chief, Laboratory Medicine Service, San Francisco General Hospital, San Francisco, California, USA,
Michael P. Busch, Director, Blood Systems Research Institute; Vice President Research and Scientific Programs, Blood Systems, Inc.; Professor of Laboratory Medicine, University of California, USA
Retroviruses have a wide distribution in nature, with examples in insects, reptiles and nearly all mammals. The human retrovirus, human immunodeficiency virus (HIV 1 and 2), belongs to the lentivirus group of the retrovirus family, whilst human T-cell lymphotropic virus (HTLV I and II) belongs to the oncorna group. Human T-cell lymphotropic virus I and II are thought to have evolved from simian T-lymphotropic retroviruses that were transmitted to humans over the past centuries or millennia. Human immunodeficiency virus is thought to have derived from simian immunodeficiency viruses that are endemic in chimpanzees in Central Africa, and probably infected humans over the past century (Sharp et al., 2001).
Retroviruses are membrane-coated, single-stranded RNA viruses that have a distinct genomic organization and require the presence of reverse transcriptase in their replication cycle. In a typical infection, retrovirus particles attach to the cell membrane, reverse transcriptase copies viral RNA into complementary double-stranded DNA, and this is integrated into the host cell chromosome. Host cell enzymes help virus and host regulatory genes complete the retrovirus lifecycle by producing virions that bud from the plasma membrane to infect other cells or organisms.
Human immunodeficiency viruses 1 and 2
Definition and characteristics of agent
Human immunodeficiency virus was discovered in the early 1980s by two groups of workers, Montagnier in France and Gallo in the USA. Originally described as human T-cell lymphotropic virus type III (HTLV-III), the virus was shown to infect T lymphocytes.
By
Carl P. McDonald, Head of Bacteriology, NHS Blood and Transplant Colindale, London, UK,
M. A. Blajchman, Canadian Blood Services and McMaster University, Hamilton, Ontario, Canada,
Brian C. Dow, Consultant Clinical Microbiologist; Head, Scottish National Blood Transfusion Service, National Microbiology Reference Unit, West of Scotland Transfusion Centre, Glasgow, UK
Bacterial transmission remains a significant problem in transfusion medicine. The problem is not new: bacterial transfusion-transmission from a blood component was first reported more than 60 years ago (Novak, 1939; Strumia and McGraw, 1941). Since the 1970s remarkable progress has been made in increasing the safety of the blood supply with regard to viruses. Unfortunately, this has not been the case with bacterial contamination. Moreover, the continued emphasis on striving for ‘zero risk’ with regard to blood-borne viruses, and on measures to prevent the ‘potential’ problem of prion transmission, has possibly been to the detriment of resolving the issue of bacterial contamination. The current risk of receiving bacterially contaminated platelet concentrates, however, may be 1000 times higher than the combined risk of transfusion-transmitted infection with the human immunodeficiency virus (HIV), hepatitis C virus, hepatitis B virus and human T-cell lymphotropic virus (HTLV) (Blajchman, 2002).
In the USA, from 1985 to 1999, bacterial contamination was, after haemolytic reactions, the most frequently reported cause of transfusion-related mortality, accounting for over 10% (77/694) of transfusion fatalities (Center for Biologics Evaluation and Research, 1999). From 1986 to 1991, 29 out of 182 (16%) transfusion-associated fatalities reported to the US Food and Drug Administration (FDA) were caused by bacterial contamination of blood components (Hoppe, 1992).
From 1994 to 1998, the French Haemovigilance system attributed 18 deaths (four occurring in 1997) to blood components contaminated with bacteria (Debeir et al., 1999; Morel, 1999a).
By
Alan D. Kitchen, Head, National Transfusion Microbiology Reference Laboratory, NHS Blood and Transplant Colindale, London, UK,
Brian C. Dow, Consultant Clinical Microbiologist; Head, Scottish National Blood Transfusion Service, National Microbiology Reference Unit, West of Scotland Transfusion Centre, Glasgow, UK
Ensuring the microbiological safety of the blood supply relies heavily on the primary screening of donated blood. Although routine donor screening assays are highly sensitive, this sensitivity is often achieved at the expense of specificity, with false-reactive rates of around 0.05–0.5% (Dow, 2000).
Blood donations found to be initially reactive at donor testing sites should be repeat tested in duplicate. Should any of the repeat tests result in reactivity, the donation is classified as ‘repeatedly reactive’, the donor is flagged on the donor database and samples are submitted to the designated national reference laboratory or other designated facility. Regardless of confirmatory test results, the donation and all its associated components will be excluded from transfusion.
Throughout the world, blood services have differing policies with regard to confirmation of microbiology reactive donations. Most developed countries' services are capable of performing adequate confirmation of reactive donations. However, some services use an alternative strategy of reporting reactivity directly to the donors, often resulting in considerable donor anxiety and potential personal expense to reach a confirmatory conclusion. Obviously, in areas of high endemicity, there is a higher predictive value associated with a repeat reactive result and in this situation, simpler confirmatory algorithms can be utilized. Generally though, in developed countries, donors have relatively low prevalences of infection and therefore more complex confirmatory algorithms, like those described in this chapter, are often necessary before notification to the apparently healthy volunteer donor.
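As a rough illustration of why prevalence matters for the predictive value of a repeat-reactive screen (the sensitivity, specificity and prevalence figures below are illustrative assumptions, not values taken from this chapter), Bayes' theorem gives the positive predictive value (PPV) of a reactive result as

\[
\mathrm{PPV} = \frac{Se \cdot p}{Se \cdot p + (1 - Sp)(1 - p)},
\]

where \(Se\) is the screening sensitivity, \(Sp\) the specificity and \(p\) the prevalence of infection among donors. Assuming \(Se = 100\%\) and \(Sp = 99.8\%\), a low-prevalence donor population with \(p = 0.01\%\) gives

\[
\mathrm{PPV} = \frac{1.0 \times 0.0001}{1.0 \times 0.0001 + 0.002 \times 0.9999} \approx 0.05,
\]

so only about 1 in 20 repeat-reactive donations would be truly infected, whereas a high-endemicity population with \(p = 5\%\) gives

\[
\mathrm{PPV} = \frac{1.0 \times 0.05}{1.0 \times 0.05 + 0.002 \times 0.95} \approx 0.96.
\]

On these assumed figures, a reactive screen in a high-prevalence setting is already strong evidence of infection, which is why simpler confirmatory algorithms can be adequate there, while the low predictive value typical of low-prevalence donor panels is what makes the more elaborate confirmatory algorithms described in this chapter necessary before donor notification.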
Although the effectiveness of cognitive behavioural therapy (CBT) in the management of panic disorder (PD) is now well established, there have been few studies of predictors of outcome with this patient group using clinical effectiveness trial data, a hypothesis-testing model, and a dependent measure of clinically significant change.
Method
The data for this study came from a randomized controlled trial of three forms of CBT delivery for PD with and without agoraphobia (two 6-week CBT programmes, one of which was computer assisted, and one therapist-directed 12-week CBT programme), comprising a total of 186 patients across two sites. Based on previous related research, five hypothesized predictors of post-treatment and follow-up outcome were identified and examined, using a series of bivariate and multivariate analyses.
Results
The results in general supported the hypotheses. Strength of blood/injury fears, age of initial onset of panic symptoms, co-morbid social anxieties and degree of agoraphobic avoidance were predictive of both measures of post-treatment outcome. Degree of residual social difficulties and the continued use of anxiolytics at post-treatment were also shown to predict poor outcome at the 6-month follow-up. However, strength of continuing dysfunctional agoraphobic cognitions by the end of active treatment did not predict outcome at follow-up for the sample as a whole.
Conclusions
The identification of consistent predictors of outcome with CBT has many clinical and research benefits. As CBT, however, is being delivered increasingly in a variety of brief formats, further research is required to identify moderators of response to these ‘non-standard’ treatment formats.
Despite the growth of reduced therapist-contact cognitive behavioural therapy (CBT) programmes, there have been few systematic attempts to determine prescriptive indicators for such programmes vis-à-vis more standard forms of CBT delivery. The present study aimed to address this in relation to brief (6-week) and standard (12-week) therapist-directed CBT for panic disorder (PD) with and without agoraphobia. Higher baseline levels of severity and associated disability/co-morbidity were hypothesized to moderate treatment effects, in favour of the 12-week programme.
Method
Analyses were based on outcome data from two out of three treatment groups (n=72) from a recent trial of three forms of CBT delivery for PD. The dependent variables were a continuous composite panic/anxiety score and a measure of clinical significance. Treatment×predictor interactions were examined using multiple and logistic regression analyses.
Results
As hypothesized, higher baseline severity, disability or co-morbidity as indexed by strength of dysfunctional agoraphobic cognitions; duration of current episode of PD; self-ratings of panic severity; and the 36-item Short Form Health Survey (SF-36) (Mental component) score were all found to predict poorer outcome with brief CBT. A similar trend was apparent in relation to baseline level of depression. With high and low end-state functioning as the outcome measure, however, only the treatment×agoraphobic cognitions interaction was found to be significant.
Conclusions
While there was no evidence that the above variables necessarily contraindicate the use of brief CBT, they were nevertheless associated with greater overall levels of post-treatment improvement with the 12-week approach.
A 71-year-old man with Stage II gastric cancer developed rapid-onset radiation-induced liver disease after ceasing adjuvant chemotherapy and radiotherapy. Autopsy revealed moderate hepatocellular iron overload. Posthumously, he was found to be a compound heterozygote for hereditary haemochromatosis. Since both radiation and iron overload may induce liver damage through the activation of hepatic stellate cells, it is possible that hepatocellular iron overload may potentiate the effects of irradiation and predispose the patient to radiation-induced liver disease.
What are the prospects for progress in heterodox economics? The question posed for this roundtable discussion raises a wide range of important issues, for heterodox economics and for economics as a whole. In order to clarify some of these issues, this contribution approaches the question from a practical point of view: what strategy would promote the progress of heterodox economics?
The central theme of this session is the changing relationship between “orthodox” (i.e., mainstream, neoclassical) and “heterodox” economics, especially in the USA, during the past two or three decades. Economics is such a large and heterogeneous discipline that it cannot be characterized both briefly and accurately. Alongside the growth of formalization and mathematization, and the high degree of uniformity in the undergraduate and graduate curricula and in the leading textbooks, there are also within the subject a number of dissenting or deviant doctrinal schools, rival methodological approaches, and innovative developments designed to remedy its defects and/or overcome its limitations. Moreover, many of the outspoken criticisms of the status quo, proposed remedies, and innovations, originate with or are endorsed by prominent economists with impeccable professional credentials. Indeed, in some cases their contributions threaten the discipline's foundations and can, therefore, be considered a species of “orthodox subversion.”
Muscle spindles in 2 synergistic avian skeletal muscles, the anterior (ALD) and posterior (PLD) latissimus dorsi, were studied by light and electron microscopy to determine whether morphological or quantitative differences existed between these sensory receptors. Differences were found in the density, distribution and location of muscle spindles in the 2 muscles. They also differed with respect to the morphology of their capsules and intracapsular components. The slow ALD possessed muscle spindles which were evenly distributed throughout the muscle, whereas in the fast PLD they were mainly concentrated around the single nerve entry point into the muscle. The muscle spindle index (number of spindles per gram wet muscle weight) in the ALD was more than double that of its fast-twitch PLD counterpart (130.5±2.0 vs 55.4±2.0 respectively, n=6). The number of intrafusal fibres per spindle ranged from 1 to 8 in the ALD and 2 to 9 in the PLD, and their diameters varied from 5.0 to 16.0 μm and 4.5 to 18.5 μm, respectively. Large diameter intrafusal fibres were more frequently encountered in spindles of the PLD. Unique to the ALD was the presence of monofibre muscle spindles (12.7% of total spindles observed in ALD) which contained a solitary intrafusal fibre. In muscle spindles of both the ALD and PLD, sensory nerve endings terminated in a spiral fashion on the intrafusal fibres in their equatorial regions. Motor innervation was restricted to either juxtaequatorial or polar regions of the intrafusal fibres. Outer capsule components were extensive in polar and juxtaequatorial regions of ALD spindles, whereas inner capsule cells of PLD spindles were more numerous in juxtaequatorial and equatorial regions. Overall, muscle spindles of the PLD exhibited greater complexity with respect to the number of intrafusal fibres per spindle, range of intrafusal fibre diameters and development of their inner capsules. It is postulated that the differences in muscle spindle density and structure observed in this study reflect the function of the muscles in which they reside.
Although food handlers are often implicated as the source of infection in outbreaks of food-borne viral gastroenteritis, little is known about the timing of infectivity in relation to illness. We investigated a gastroenteritis outbreak among employees of a manufacturing company and found an association (RR=14·1, 95% CI=2·0–97·3) between disease and eating sandwiches prepared by 6 food handlers, 1 of whom reported gastroenteritis which had subsided 4 days earlier. Norwalk-like viruses were detected by electron microscopy or reverse transcriptase-polymerase chain reaction (RT-PCR) in stool specimens from several company employees, the sick food handler whose specimen was obtained 10 days after resolution of illness, and an asymptomatic food handler. All RT-PCR product sequences were identical, suggesting a common source of infection. These data support observations from recent volunteer studies that current recommendations to exclude food handlers from work for 48–72 h after recovery from illness may not always prevent transmission of Norwalk-like viruses because virus can be shed up to 10 days after illness or while exhibiting no symptoms.