Objective:
To compare the agreement and cost of two recall methods for estimating children’s minimum dietary diversity (MDD).
Design:
We assessed child’s dietary intake on two consecutive days: an observation on day one, followed by two recall methods (list-based recall and multiple-pass recall) administered in random order by different enumerators at two different times on day two. We compared the estimated MDD prevalence using survey-weighted linear probability models following a two one-sided test equivalence testing approach. We also estimated the cost-effectiveness of the two methods.
Setting:
Cambodia (Kampong Thom, Siem Reap, Battambang, and Pursat provinces) and Zambia (Chipata, Katete, Lundazi, Nyimba, and Petauke districts).
Participants:
Children aged 6–23 months: 636 in Cambodia and 608 in Zambia.
Results:
MDD estimations from both recall methods were equivalent to the observation in Cambodia but not in Zambia. Both methods were equivalent to the observation in capturing most food groups. Both methods were highly sensitive although the multiple-pass method accurately classified a higher proportion of children meeting MDD than the list-based method in both countries. Both methods were highly specific in Cambodia but moderately so in Zambia. Cost-effectiveness was better for the list-based recall method in both countries.
Conclusion:
The two recall methods estimated MDD and most other infant and young child feeding indicators equivalently in Cambodia but not in Zambia, compared to the observation. The list-based method produced slightly more accurate estimates of MDD at the population level, took less time to administer and was less costly to implement.
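The two one-sided test (TOST) equivalence approach used to compare each recall method against observation can be sketched as follows. This is a minimal normal-approximation version on two independent proportions; the equivalence margin, prevalences and sample sizes below are hypothetical, and it does not replicate the survey-weighted linear probability models used in the study.

```python
import math

def tost_two_proportions(p1, n1, p2, n2, margin, alpha=0.05):
    """Two one-sided tests (TOST) for equivalence of two proportions.

    Declares the proportions equivalent at level alpha if the
    (1 - 2*alpha) confidence interval for (p1 - p2) lies entirely
    within (-margin, +margin). Normal-approximation sketch only.
    """
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    z = 1.6449  # one-sided z critical value for alpha = 0.05
    lo, hi = diff - z * se, diff + z * se
    return -margin < lo and hi < margin

# Hypothetical example: recall-based vs observed MDD prevalence
# in a sample of 636 children, with a 10-percentage-point margin.
equivalent = tost_two_proportions(0.62, 636, 0.60, 636, margin=0.10)
```

Equivalence testing reverses the usual null hypothesis: rather than asking whether the two prevalences differ, it asks whether their difference can be shown to fall inside a pre-specified margin, which is why a non-significant difference alone (as in the Zambia results) is not evidence of equivalence.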
Survival and mortality of extensive hill ewes are important commercial factors and valuable indicators of welfare, but difficult to study. Body condition scoring is a tried-and-tested management and monitoring tool which has been shown to be a good predictor of fitness in ewes and is easily measured under hill farm conditions. This paper presents the results of a study on ewe survival rates in hill conditions in Scotland. Ewe performance and survival rates were measured in two contrasting hill flocks over a five-year period. Statistical analysis showed that mid-pregnancy body condition score was the most reliable indicator of subsequent ewe survival, followed closely by age of the ewe and environmental and management conditions. This study confirms that there are considerable welfare issues related to hill flocks and that improved winter nutritional management is a key component of good welfare and productivity. It also reinforces the view that body condition score is a good quantitative predictor of animal welfare and that poor mid-winter score indicates high risk of mortality, at both the flock and individual ewe level.
Veterans Affairs (VA) healthcare providers perceive that Veterans expect, and base visit satisfaction on, receiving antibiotics for upper respiratory tract infections (URIs). No studies have tested this hypothesis. We sought to determine whether receiving and/or expecting antibiotics were associated with Veteran satisfaction with URI visits.
Methods:
This cross-sectional study included Veterans evaluated for URI January 2018–December 2019 in an 18-clinic ambulatory VA primary-care system. We evaluated Veteran satisfaction via the Patient Satisfaction Questionnaire Short Form (RAND Corporation), an 18-item 5-point Likert scale survey. Additional items assessed Veteran antibiotic expectations. Antibiotic receipt was determined via medical record review. We used multivariable regression to evaluate whether antibiotic receipt and/or Veteran antibiotic expectations were associated with satisfaction. Subgroup analyses focused on Veterans who accurately remembered antibiotic prescribing during their URI visit.
Results:
Of 1,329 eligible Veterans, 432 (33%) participated. Antibiotic receipt was not associated with differences in mean total satisfaction (adjusted score difference, 0.6 points; 95% confidence interval [CI], −2.1 to 3.3). However, mean total satisfaction was lower for Veterans expecting an antibiotic (adjusted score difference −4.4 points; 95% CI −7.2 to −1.6). Among Veterans who accurately remembered the visit and did not receive an antibiotic, those who expected an antibiotic had lower mean satisfaction scores than those who did not (unadjusted score difference, −16.6 points; 95% CI, −24.6 to −8.6).
Conclusions:
Veteran expectations for antibiotics, not antibiotic receipt, are associated with changes in satisfaction with outpatient URI visits. Future research should further explore patient expectations and development of patient-centered and provider-focused interventions to change patient antibiotic expectations.
The completion of a laser safety course remains a core surgical curriculum requirement for otolaryngologists training in the UK. This project aimed to develop a comprehensive laser safety course utilising both technical and non-technical skills simulation.
Methods
Otolaryngology trainees and consultants from the West of Scotland Deanery attended a 1-day course comprising lectures, two high-fidelity simulation scenarios and a technical simulation of safe laser use in practice.
Results
The course, and in particular the use of simulation training, received excellent feedback from otolaryngology trainees and consultants who participated. Both simulation scenarios were validated for future use in laser simulation.
Conclusion
The course has been recognised as a laser safety course sufficient for the otolaryngology Certificate of Completion of Training. To the authors’ knowledge, this article represents the first description of using in situ non-technical skills simulation training for teaching laser use in otolaryngology.
The aquatic diplocaulid nectridean Keraterpeton galvani is the commonest taxon represented in the Jarrow Coal assemblage from Kilkenny, Ireland. The Jarrow locality has yielded the earliest known Carboniferous coal-swamp fauna in the fossil record and is, therefore, of importance in understanding the history and diversity of the diplocaulid clade. The morphology of Keraterpeton is described in detail with emphasis on newly observed anatomical features. A reconstruction of the palate includes the presence of interpterygoid vacuities and new morphological details of the pterygoid, parasphenoid and basicranial region. The hyoid apparatus, comprising an ossified basibranchial element, has not been reported previously in nectrideans. The structure of the scapulocoracoid and the primitive nature of the humerus are described, and the presence of a five-digit manus is confirmed. Previously unrecognised accessory dermal ossifications are present in the pectoral girdle. Keraterpeton longtoni from the Bolsovian in Staffordshire, England, is also described and newly figured. The primitive condition in diplocaulids is defined on the basis of the earliest occurrence at Jarrow and discussed in relation to functional morphology and mode of life. The evolution of the diplocaulid clade is assessed in relation to the revised diagnoses that define the primitive condition in Keraterpeton.
To investigate the practice of hunting by local people in the southern Bahia region of Brazil and provide information to support the implementation of the National Action Plan for Conservation of the Central Atlantic Forest Mammals, we conducted 351 interviews with residents of three protected areas and a buffer zone. Thirty-seven percent of respondents stated that they had captured an animal opportunistically, 16% hunted actively and 47% did not hunt. The major motivation for hunting was consumption but people also hunted for medicinal purposes, recreation and retaliation. The most hunted and consumed species were the paca Cuniculus paca, the nine-banded armadillo Dasypus novemcinctus and the collared peccary Pecari tajacu; threatened species were rarely hunted. Opinions varied on whether wildlife was declining or increasing; declines were generally attributed to hunting. Our findings suggest there is illegal hunting for consumption in and around protected areas of the region. Management efforts should prioritize fairness in the expropriation process for people who must be relocated, and adopt an approach to wildlife management that involves residents living around the protected areas, and considers their needs.
The aim of this analysis was to establish the basic mechanical principles of simple archosaur cranial form. In particular we estimated the influence of two key archosaur innovations, the secondary palate and the antorbital fenestra, on the optimal resistance of biting-induced loads. Although such simplified models cannot substitute for more complex cranial geometries, they can act as a clearly derived benchmark that can serve as a reference point for future studies incorporating more complex geometry. We created finite element (FE) models comprising either a tall, domed (oreinirostral) snout or a broad, flat (platyrostral) archosaur snout. Peak von Mises stress was recorded in models with and without a secondary palate and/or antorbital fenestra after the application of bite loads to the tooth row. We examined bilateral bending and unilateral torsion-inducing bites for a series of bite positions along the jaw, and conducted a sensitivity analysis of material properties. Pairwise comparison between different FE morphotypes revealed that oreinirostral models are stronger than their platyrostral counterparts. Oreinirostral models are also stronger in bending than in torsion, whereas platyrostral models are equally susceptible to either load type. As expected, we found that models with a fenestra always have the greatest peak stresses and by inference are “weaker,” significantly so in oreinirostral forms and anterior biting platyrostral forms. Surprisingly, although adding a palate always lowers peak stress, this is rarely by large magnitudes and is not significant in bilateral bending bites. The palate is more important in unilateral torsion-inducing biting. Two basic principles of archosaur cranial construction can be derived from these simple models: (1) forms with a fenestra are suboptimally constructed with respect to biting, and (2) the presence or absence of a palate is significant to cranial integrity in unilaterally biting animals.
Extrapolating these results to archosaur cranial evolution, it appears that if mechanical optimization were the only criterion on which skull form is based, then most archosaurs could in theory strengthen their skulls to increase resistance to biting forces. These strengthened morphotypes are generally not observed in the fossil record, however, and therefore archosaurs appear subject to various non-mechanical morphological constraints. Carnivorous theropod dinosaurs, for example, may retain large, suboptimal fenestrae despite generating large bite forces, owing to an interplay between craniofacial ossification and pneumatization. Furthermore, living crocodylians appear to strengthen their skull with a palate and filled fenestral opening in the most efficient way possible, despite being constrained perhaps by hydrodynamic factors to the weaker platyrostral morphotype. The future challenge is to ascertain whether these simple predictions are maintained when the biomechanics of complex cranial geometries are explored in more detail.
Southeast Asia has sometimes been portrayed as a static place. In the ninth to fourteenth centuries, however, the region experienced extensive trade, bitter wars, kingdoms rising and falling, ethnic groups on the move, the construction of impressive monuments and debate about profound religious issues. Readers of this volume will learn much of how people lived in Southeast Asia five hundred to one thousand years ago; the region today cannot be comprehended without reference to the seminal developments of that period.
Neurocognitive impairment is a frequent complication of HIV infection and heralds a poor survival prognosis. With the availability of highly active antiretroviral therapy (HAART), survival times for HIV-infected patients have markedly increased although the effects of HAART on the prevalence of neurocognitive impairment remain uncertain.
Objective:
To determine the relationship between self-reported neurocognitive symptoms and neuropsychological (NP) performance together with the impact of HAART among HIV-infected patients.
Methods:
A cross-sectional study was performed in which patients without previously documented neurocognitive impairment attending an HIV community clinic were questioned about neurocognitive symptoms and a NP test battery was administered.
Results:
Of the eighty-three patients examined, 34% reported neurocognitive symptoms, which were associated with a shorter duration of HAART and higher viral loads. Patients reporting neurocognitive symptoms were also more likely to exhibit impaired NP performance (p<0.005), with NP impairment being detected in 46% of all patients examined (12% with HIV-associated dementia). Neuropsychological impairment was directly correlated with age (p<0.001) and plasma viral load (p<0.005), and inversely correlated with the number of prescribed antiretroviral drugs (p<0.01).
Conclusion:
These results suggest that neurocognitive symptoms are predictive of impaired NP performance and that NP impairment remains a frequent finding among older patients with higher viral loads. An increased number of antiretroviral drugs may be neuroprotective.
A year-long intervention trial was conducted to characterise the responses of multiple biomarkers of Se status in healthy American adults to supplemental selenomethionine (SeMet) and to identify factors affecting those responses. A total of 261 men and women were randomised to four doses of Se (0, 50, 100 or 200 μg/d as l-SeMet) for 12 months. Responses of several biomarkers of Se status (plasma Se, serum selenoprotein P (SEPP1), plasma glutathione peroxidase activity (GPX3), buccal cell Se, urinary Se) were determined relative to genotype of four selenoproteins (GPX1, GPX3, SEPP1, selenoprotein 15), dietary Se intake and parameters of single-carbon metabolism. Results showed that supplemental SeMet did not affect GPX3 activity or SEPP1 concentration, but produced significant, dose-dependent increases in the Se contents of plasma, urine and buccal cells, each of which plateaued by 9–12 months and was linearly related to effective Se dose (μg/d per kg0·75). The increase in urinary Se excretion was greater for women than men, and for individuals of the GPX1 679 T/T genotype than for those of the GPX1 679 C/C genotype. It is concluded that the most responsive Se-biomarkers in this non-deficient cohort were those related to body Se pools: plasma, buccal cell and urinary Se concentrations. Changes in plasma Se resulted from increases in its non-specific component and were affected by both sex and GPX1 genotype. In a cohort of relatively high Se status, the Se intake (as SeMet) required to support plasma Se concentration at a target level (Sepl-target) is: .
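The dose scaling used above — μg Se per day per kg0·75, i.e. per unit metabolic body weight — can be illustrated with a minimal sketch. The function name and the 70 kg body weight are hypothetical examples, not values from the study.

```python
def effective_se_dose(dose_ug_per_day, body_weight_kg):
    """Selenium dose expressed per unit metabolic body weight
    (ug/d per kg^0.75), the 'effective Se dose' scaling to which
    the plasma, urinary and buccal cell Se responses were
    reported to be linearly related."""
    return dose_ug_per_day / body_weight_kg ** 0.75

# The highest trial dose, 200 ug/d, for a hypothetical 70 kg adult:
dose = effective_se_dose(200, 70)  # roughly 8.3 ug/d per kg^0.75
```

Scaling by kg0·75 rather than body weight itself follows the convention that metabolic rate, and hence nutrient turnover, scales allometrically with body mass, so the same absolute dose represents a smaller effective dose in a larger person.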
The Okenyenya igneous complex is one of a suite of intrusions which define a prominent northeast-trending linear feature in Damaraland, northwestern Namibia. Precise Rb–Sr internal isochron ages range from 128.6 ± 1 to 123.4 ± 1.4 Ma for the major phases of intrusion identified within the complex. The tholeiitic gabbros forming the outer rings of the complex, and the later alkali gabbros which form the central hills, cannot be distinguished in terms of Rb–Sr ages, although field relations clearly indicate the younger age of the latter. The intrusions of nepheline-syenite and essexite comprising the mountain of Okenyenya Berg on the northern edge of the complex give ages of 123.4 ± 1.4 and 126.3 ± 1 Ma, respectively, and form the final major phase of intrusion. The ages obtained for early and late intrusive phases define a minimum magmatic ‘life-span’ of approximately 5 Ma for the complex. The determined age of the Okenyenya igneous complex (129–123 Ma), when taken together with the few reliable published ages for other Damaraland complexes (130–134 Ma), suggests that these sub-volcanic complexes were emplaced contemporaneously with the widespread Etendeka volcanics (~130 Ma), and relate to magmatism associated with the breakup of southern Africa and South America with the opening of the South Atlantic Ocean. The linear distribution of intrusions in Damaraland is interpreted to be due to magmatism resulting from the upwelling Tristan plume being focused along a structural discontinuity between the Pan-African Damaran terrain to the south, and Proterozoic cratonic basement to the north.
Since the Second World War, preferential trading arrangements (PTAs) have become increasingly pervasive features of the international economic system. A great deal of research has addressed the economic consequences of these arrangements, but far less effort has been made to identify the political factors leading states to enter them. This article investigates the domestic political factors affecting whether countries enter PTAs, placing particular emphasis on the number of veto players within a state. It is argued that the probability of forming a PTA declines as the number of such players rises. The results, covering 194 countries from 1950 to 1999, strongly support this argument. Holding various political and economic factors constant, increasing the number of veto players within a country significantly reduces the probability of signing a PTA.
Nutrigenomics is the study of how constituents of the diet interact with genes, and their products, to alter phenotype and, conversely, how genes and their products metabolise these constituents into nutrients, antinutrients, and bioactive compounds. Results from molecular and genetic epidemiological studies indicate that dietary unbalance can alter gene–nutrient interactions in ways that increase the risk of developing chronic disease. The interplay of human genetic variation and environmental factors will make identifying causative genes and nutrients a formidable, but not intractable, challenge. We provide specific recommendations for how to best meet this challenge and discuss the need for new methodologies and the use of comprehensive analyses of nutrient–genotype interactions involving large and diverse populations. The objective of the present paper is to stimulate discourse and collaboration among nutrigenomic researchers and stakeholders, a process that will lead to an increase in global health and wellness by reducing health disparities in developed and developing countries.
We present a case study of the use of simulation modelling to develop and test strategies for managing populations under uncertainty. Strategies that meet a stock conservation criterion under a base case scenario are subjected to a set of robustness trials, including biased and highly variable abundance estimates and poaching. Strategy performance is assessed with respect to a conservation criterion, the revenues achieved and their variability. Strategies that harvest heavily, even when the population is apparently very large, perform badly in the robustness trials. Setting a threshold below which harvesting does not take place, and above which all individuals are harvested, does not provide effective protection against over-harvesting. Strategies that rely on population growth rates rather than estimates of population size are more robust to biased estimates. The strategies that are most robust to uncertainty are simple, involving harvesting a relatively small proportion of the population each year. The simulation modelling approach to exploring harvesting strategies is suggested as a useful tool for the assessment of the performance of competing strategies under uncertainty.
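A minimal sketch of the kind of simulation described here, assuming logistic population growth and a harvest rule that sees only a noisy abundance estimate. All parameter values and strategy definitions are illustrative, not the study's operating model or robustness trials.

```python
import random

def simulate(strategy, years=100, n0=1000.0, r=0.08, k=2000.0,
             obs_cv=0.3, seed=0):
    """Project a logistic population under a harvest strategy that is
    applied to a noisy abundance estimate rather than the true size.
    Returns (final population, total harvest taken)."""
    rng = random.Random(seed)
    n, total = n0, 0.0
    for _ in range(years):
        est = n * max(0.0, rng.gauss(1.0, obs_cv))  # imperfect survey
        h = min(strategy(est), n)                   # cannot take more than exists
        n = max(0.0, n - h)
        n += r * n * (1 - n / k)                    # logistic growth
        total += h
    return n, total

# A constant small proportion of the *estimated* population each year
# (the style of rule the study found most robust):
proportional = lambda est: 0.05 * est

# Harvest everything above a fixed threshold (a rule the study found
# gave poor protection against over-harvesting):
threshold = lambda est: max(0.0, est - 800.0)

n_prop, _ = simulate(proportional)
n_thr, _ = simulate(threshold)
```

Running each candidate rule many times, across scenarios with biased estimates or poaching added, and scoring the outcomes against a conservation criterion and revenue variability is the management-strategy-evaluation loop the abstract describes.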
This chapter reviews briefly the apparent responses to global events by the amphibians, reptiles and birds during the past 145 million years. The vast majority of living members of these groups are terrestrial or freshwater (aquatic or amphibious) dwellers. Some 4956 species of amphibians are represented by 28 families of Anura (frogs and toads), ten families of Caudata (newts and salamanders) and five families of Gymnophiona (caecilians) (Cogger & Zweifel, 1998). Over 7427 living species of reptiles are represented by 13 families of Testudinata (turtles and tortoises), 48 families of Lepidosauromorpha (comprising 26 lizard families, ca. 18 snake families, 4 amphisbaenid (worm lizard) families and a single relict rhynchocephalian genus, the Tuatara (Sphenodon)) and the Archosauria (Cogger & Zweifel, 1998). The last group includes the extinct Pterosauria and the wholly terrestrial Dinosauria, and is today represented by three families of Crocodylia (crocodiles and alligators) and by the birds, the most numerous terrestrial vertebrates with over 9700 species in more than 20 orders, more than 50% belonging to the Passeriformes (song birds) (Feduccia, 1996).
The geographical distribution patterns of these groups are intimately linked to the historical distribution of continental areas, and we can attempt to relate the patterns of their past and present distributions to the patterns produced by the global dynamics of plate tectonics. An understanding of the patterns of relationships within these (and other) groups of organisms is fundamental to attempts to interpret their vicariance and dispersal patterns.
A number of methods exist by which the pH of local anaesthetic solutions may be increased. Most commonly, these require the addition of differing amounts of sodium bicarbonate solution according to the local anaesthetic drug used. Sodium bicarbonate (1%) was titrated against pH in six commonly used local anaesthetic solutions. Titration curves of pH against volume of sodium bicarbonate solution added are shown for this group of local anaesthetics. This study demonstrates that 1 mL of 1% sodium bicarbonate solution may be used to alkalinize this range of local anaesthetics without the risk of precipitation. We also conclude that ropivacaine (at concentrations of 0.75% and 1.0%) is unsuitable for alkalinization, since it precipitates at a pH of 6.0.
Subsamples of termite mound soil used by chimpanzees for geophagy, and topsoil never ingested by them, from the forest floor in the Mahale Mountains National Park, Tanzania, were analysed to determine the possible stimulus or stimuli for geophagy. The ingested samples have a dominant clay texture equivalent to a claystone, whereas the control samples are predominantly sandy clay loam or sandy loam, which indicates that particle size plays a significant role in soil selection for this behaviour. One potential function of the clays is to bind and adsorb toxins. Although both termite mound and control samples have similar alkaloid-binding capacities, they are in every case very high, with the majority of the samples being above 80%. The clay size material (<2 μm) contains metahalloysite and halloysite, the latter a hydrated aluminosilicate (Al2Si2O4·nH2O), present in the majority of both the termite mound soil and control soil samples. Metahalloysite, one of the principal ingredients found in the pharmaceutical Kaopectate™, is used to treat minor gastric ailments in humans. The soils commonly ingested could also function as antacids, as over half had pH values between 7.2 and 8.6. The mean concentrations of the majority of elements measured were greater in the termite mound soils than in the control soils. The termite mound soils had more filamentous bacteria, whereas the control soils contained greater numbers of unicellular bacteria and fungi.
Tibial torsion, twisting of the tibia about its longitudinal axis, varies during development and early childhood. Knowledge of the normal range of tibial torsion at various ages, and its accurate clinical measurement, is important in assessing the extent of a torsional deformity; a reliable technique for measuring tibial torsion in vivo is therefore required. The aim of this study was to determine which of four existing in vivo methods of measuring tibial torsion was the most accurate and had the highest repeatability, by comparing them with direct measurement of the tibia. A wide range of mean values for tibial torsion was observed using the various techniques, with none of the indirect techniques employed having a strong correlation with direct measurement of tibial torsion. The repeatability of the indirect techniques was observed to be low both in cadavers (n=4) and in the living (n=3). Since none of the in vivo techniques appears to measure true tibial torsion or to be of reasonable repeatability, alternative easy-to-use and inexpensive methods need to be developed. It is recommended that data gained using the methods reviewed here are interpreted with caution.