Clay/electrolyte systems are best characterized using a new variable, notional interfacial content (NIC). The NIC approach emphasizes that any division of the total amount of a species present in a clay/electrolyte system into a part which ‘belongs’ to the clay and a part which ‘belongs’ to the solution is necessarily arbitrary. It carries out this division by choosing one of the species as a reference species and defining it as being completely in a notional bulk solution which has the same composition, but not the same extent, as the real bulk solution. The NIC of each of the other species present, that is, the part of its total amount not in the notional bulk solution, therefore represents the amount of that species ‘belonging’ to the clay.
The NIC concept is universal and hence encompasses several other, older terms. For example, by choosing different reference species, the variables hitherto called ‘water adsorption’ and ‘negative adsorption’ (which have been used to describe the same phenomenon) may be obtained. Similarly, certain earlier definitions of exchangeable and adsorbed cations (excluding some purely pragmatic ones), as well as that of ion surface excesses, may be accounted for. The NIC approach thus rationalizes several earlier terms which are, in fact, plagued by a multiplicity of definitions that makes their use very complicated.
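In symbols, the construction above can be sketched in the style of a Gibbs relative surface excess (the notation here is illustrative, not the paper's own): let $n_i$ be the total amount of species $i$ in the system and $c_i$ its concentration in the real bulk solution, with species $r$ chosen as the reference.

```latex
% The notional bulk volume is fixed by placing all of the reference
% species r in the notional bulk solution:
V_b = \frac{n_r}{c_r}
% The NIC of any other species i is its total amount minus the part
% assigned to that notional bulk:
n_i^{\mathrm{NIC}} = n_i - c_i V_b = n_i - n_r \,\frac{c_i}{c_r}
```

Choosing a different reference species $r$ changes $V_b$ and hence every $n_i^{\mathrm{NIC}}$, which is how the same framework reproduces quantities such as ‘water adsorption’ and ‘negative adsorption’.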
This paper introduces historical aspects of the concepts correspondence and coherence with emphasis on the nineteenth century when key aspects of modern science were emerging. It is not intended to be a definitive history of the concepts of correspondence and coherence as they have been used across the centuries in the field of inquiry that we now call science. Rather it is a brief history that highlights the apparent origins of the concepts and provides a discussion of how these concepts contributed to two important science related controversies. The first relates to aspects of evolution in which correspondence and coherence, as competing theories of truth, played a central role. The controversy about evolution continues into the beginning of the twenty-first century in forms that are recognizably similar to those of the middle of the nineteenth century. The second controversy relates to the etiology of blood-born infections (sepsis) during childbirth (childbed fever). In addition to correspondence and coherence, the authors introduce other theories of truth and discuss an evolutionarily cogent theory of truth, the pragmatic theory of truth.
Sequence-based association studies are at a critical inflexion point with the increasing availability of exome-sequencing data. A popular test of association is the sequence kernel association test (SKAT). Weights are embedded within SKAT to reflect the hypothesized contribution of the variants to the trait variance. Because the true weights are generally unknown, and so are subject to misspecification, we examined the efficiency of a data-driven weighting scheme. We propose the use of a set of theoretically defensible weighting schemes, assuming that the scheme which yields the largest test statistic is the one most likely to capture the allele frequency–functional effect relationship. We show that the use of alternative weights obviates the need to impose arbitrary frequency thresholds. As both the score test and the likelihood ratio test (LRT) may be used in this context, and may differ in power, we characterize the behavior of both tests. The two tests have equal power if the set includes weights resembling the correct ones. However, if the weights are badly specified, the LRT shows superior power due to its robustness to misspecification. With this data-driven weighting procedure the LRT detected significant signal in genes located in regions already confirmed as associated with schizophrenia, PRRC2A (p = 1.020e-06) and VARS2 (p = 2.383e-06), in the Swedish schizophrenia case-control cohort of 11,040 individuals with exome-sequencing data. The score test is currently preferred for its computational efficiency and power; indeed, assuming correct specification, in some circumstances the score test is the most powerful test. However, the LRT has the advantageous properties of being generally more robust and more powerful under weight misspecification. This is an important result given that, arguably, misspecified models are likely to be the rule rather than the exception in weighting-based approaches.
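The data-driven weighting idea can be sketched as follows. This is a minimal illustration on simulated data, not the authors' implementation: SKAT-style statistics are computed under several Beta(MAF; a, b) weighting schemes and the scheme giving the largest statistic is retained. The scheme set and all parameters are illustrative assumptions.

```python
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(0)
n, m = 500, 20                        # individuals, rare variants
maf = rng.uniform(0.001, 0.05, m)     # minor allele frequencies
G = rng.binomial(2, maf, size=(n, m)).astype(float)  # genotype matrix
y = rng.normal(size=n)                # quantitative trait (null here)

def skat_Q(G, y, a, b):
    """SKAT-style statistic Q = r' G W G' r with W = diag(w_j^2),
    w_j = Beta(a, b) density evaluated at the variant's MAF."""
    w = beta.pdf(maf, a, b)           # per-variant weights
    r = y - y.mean()                  # residuals under the null (no covariates)
    s = G.T @ r                       # per-variant score contributions
    return float(np.sum(w**2 * s**2))

# Candidate weighting schemes; keep the one giving the largest statistic.
schemes = [(1, 1), (1, 25), (0.5, 0.5)]
Q_best, scheme_best = max((skat_Q(G, y, a, b), (a, b)) for a, b in schemes)
```

Note that selecting the maximum over several schemes inflates the naive null distribution, so in practice the p-value must be adjusted, e.g. by permutation or by accounting for the multiple correlated statistics.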
The genetic basis of weedy and invasive traits and their evolution remains poorly understood, but genomic approaches offer tremendous promise for elucidating these important features of weed biology. However, the genomic tools and resources available for weed research are currently meager compared with those available for many crops. Because genomic methodologies are becoming increasingly accessible and less expensive, the time is ripe for weed scientists to incorporate these methods into their research programs. One example is next-generation sequencing technology, which can greatly increase sequencing output from the transcriptome of a weedy plant at reduced cost. Successful implementation of these approaches will require collaborative efforts that focus resources on common goals and bring together expertise in weed science, molecular biology, plant physiology, and bioinformatics. We outline how these large-scale genomic programs can aid both our understanding of the biology of weedy and invasive plants and our success at managing these species in agriculture. Judicious selection of species for developing weed genomics programs is needed, and we offer several candidates, although no Arabidopsis-like model species exists in the world of weeds. Given well-placed effort and resources, we outline a roadmap for creating a powerful synergy between weed science and genomics.
Using univariate sum scores in genetic studies of twin data is common practice. This practice precludes an investigation of the measurement model relating the individual items to an underlying factor. Absence of measurement invariance across a grouping variable such as gender or environmental exposure refers to group differences with respect to the measurement model. It is shown that a decomposition of a sum score into genetic and environmental variance components leads to path coefficients of the additive genetic factor that are biased differentially across groups if individual items are non-invariant. The arising group differences in path coefficients are identical to what is known as “scalar sex limitation” when gender is the grouping variable, or as “gene by environment interaction” when environmental exposure is the grouping variable. In both cases the interpretation would be in terms of a group-specific effect size of the genetic factor. This interpretation may be incorrect if individual items are non-invariant.
Twin studies of complex traits, such as behavior or psychiatric diagnoses, frequently involve univariate analysis of a sum score derived from multiple items. In this article, we show that absence of measurement invariance across zygosity can bias estimates of genetic and environmental components of variance. Specifically, if the item responses are considered as multiple indicators of a latent factor, and the aim is to partition the variance in the latent factor, then the factor loadings relating the items to the factor should be equal for monozygotic (MZ) and dizygotic (DZ) twins. While it seems unlikely, a priori, that these loadings should differ as a function of zygosity, certain special measurement situations are cause for concern. Ratings by parents, or self-ratings of phenotypes which are more easily observed in others than via introspection, may be tainted by the co-twin's phenotype to a greater extent in MZ than DZ pairs. We also show that the analysis of sum scores typically biases both MZ and DZ correlations compared to the true latent trait correlation. These two sources of bias are quantified for a range of values and are shown to be especially acute for sum scores based on binary items. Solutions to these problems include formal tests for measurement invariance across zygosity prior to analysis of the sum or scale scores, and multivariate genetic analysis at the individual item or symptom level.
This article is concerned with the power to detect the presence of genotype by environment interaction (G × E) in the case that both genes and environment feature as latent (i.e., unmeasured) variables. The power of the test proposed by Jinks and Fulker (1970), which is based on regressing the absolute difference between the scores of monozygotic twins on the sums of these scores, is compared to the power of an alternative test, which is based on Marginal Maximum Likelihood (MML). Simulation studies showed that generally the power of the MML-based test was greater than the power of the Jinks and Fulker test in detecting linear and curvilinear G × E interaction, regardless of whether the distribution of the data deviated significantly from normality. However, after a normalizing transformation, the Jinks and Fulker test performed slightly better. Some possible future extensions of the MML-based test are briefly discussed.
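The Jinks and Fulker (1970) procedure described above is simple enough to sketch directly. The following is an illustrative simulation, not the authors' code: MZ pair scores are generated with a genotype-dependent environmental variance (a form of G × E), and the absolute within-pair difference is regressed on the pair sum; a nonzero slope signals interaction. All simulation parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
# Simulated MZ pairs with G x E: environmental scale grows with the
# (shared) additive genetic value A.
A = rng.normal(size=n)
e1 = rng.normal(size=n) * np.exp(0.3 * A)   # pair member 1's environment
e2 = rng.normal(size=n) * np.exp(0.3 * A)   # pair member 2's environment
t1, t2 = A + e1, A + e2                     # observed twin phenotypes

# Jinks & Fulker test: regress |t1 - t2| on (t1 + t2).
d = np.abs(t1 - t2)                         # absolute within-pair difference
s = t1 + t2                                 # pair sum
X = np.column_stack([np.ones(n), s])        # intercept + sum
coef, *_ = np.linalg.lstsq(X, d, rcond=None)
slope = coef[1]                             # nonzero slope suggests G x E
```

In practice the slope would be tested against its standard error; curvilinear G × E can be probed by adding a quadratic term in the pair sum to the regression.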
The introduction of screening for hepatitis C virus (HCV) by the National Blood Transfusion Service identified donors who had acquired HCV infection. We undertook a case-control study amongst blood donors in the Trent Region to determine risks for HCV infection. A total of 74 blood donors confirmed positive for hepatitis C infection and 150 age-, sex- and donor-venue-matched controls were included in the study. Fifty-three percent of hepatitis C infected blood donors reported previous use of injected drugs, compared to no controls; the relative risk (RR) was not estimable (lower limit of 95% CI = 20). Other risk factors were a history of: receipt of a blood transfusion or blood products, RR = 3·6 (95% CI 1·5–8·3); having been a ‘health care worker’, RR = 2·8 (95% CI 1·1–7·6); tattooing, RR = 3·3 (95% CI 1·2–8·7); and an association with having been born abroad, RR = 3·2 (95% CI 1·1–9·5). No risk was shown for a history of multiple sexual partners, ear piercing or acupuncture. Injecting drug use explains more than 50% of hepatitis C infections in blood donors, a group who are less likely to have injected drugs than the general population.
Experiments are described which study the effect of density and chop length on the rate of diffusion of oxygen into silage and equations are presented which enable the rate of diffusion to be calculated under laboratory conditions. The concept of zero porosity is also discussed and methods of calculating the density at which it occurs are given. The effect of carbon dioxide on the rate of diffusion is also discussed.
The aim of this study was to compare the incidence of community-acquired Legionnaires' Disease in Nottingham with that in England and Wales and to explore reasons for any difference observed. Based on data from the National Surveillance Scheme for Legionnaires' Disease (1980–1999), the rate of infection in England and Wales was 1·3 per million/year compared with 6·6 per million/year in Nottingham. Domestic water samples were obtained from 41 (95%) of 43 Nottingham cases between 1997 and 2000. In 16 (39%) cases, Legionella sp. were cultured in significant quantities. Proximity to a cooling tower was examined using a 1:4 case-control analysis. No significant difference in the mean distance between place of residence and the nearest cooling tower was noted (cases 2·7 km vs. controls 2·3 km; P=0·5). These data suggest that Nottingham does have a higher rate of legionella infection compared to national figures and that home water systems are a source.
The effects of adding various chemical surfactants to the prepolymer syrup on the electro-optical switching properties of Bragg gratings recorded in polymer dispersed liquid crystals (PDLCs) have been studied. The gratings were holographically recorded in prepolymer recipes substituting hexanoic, heptanoic, propylpentanoic and octanoic acids as surfactants in the recipe. A small percentage of a monomer containing an attached long-chain alkyl group (vinyl neononanoate) was also used instead of a surfactant. The addition of surfactants lowered the required switching field from 17 V/μm with no surfactant to between 1.5 and 8 V/μm at a concentration of about 6.7% by weight, depending on the surfactant. Field-on response times for electrical switching decreased with the addition of surfactant, while field-off times increased.
An archaeological evaluation and excavation were carried out prior to a housing development in 1992, at Bramcote Green, in the London Borough of Southwark. Up to 3 m of organic-rich, alluvial clay silts were deposited during the Late Glacial period between about 12,000 BP and 9000 BP. A wide, shallow channel flowing south towards the Thames cut through the clay silts during the early Holocene and was filled with a series of clay and peat layers. Between 6000 BP and 4000 BP fast-moving water channels formed on the marshy ground on the east side of the site, and broader channels on the sand and gravel outcrop on the west side of the site. A subsequent rise in water levels, possibly seasonal, deposited inorganic muds across most of the site until c. 3500 BP. Two phases of a wooden trackway were laid over the filled-in channel, possibly crossing the marsh between high ground to the south and Bermondsey Island to the north. The earlier trackway consisted of two parallel lines of alder logs held in place by alder stakes. The second consisted of a single line of oak logs with alder stakes along one side. Radiocarbon dating of the second trackway places it in the middle of the 2nd millennium BC. The site was covered by a thick layer of peat dated to the Late Bronze Age.
Logit analysis is used to evaluate the performance of the zoning body in Frederick County, Maryland, in terms of its statutory policy objectives. Models are formulated to test the hypothesis that the rezoning process is consistent with the guidelines specified in the County Zoning Ordinance. An analysis of 59 requests for rezoning from agricultural to other uses indicates that, in general, both the Planning Commission staff and the County Commissioners conform to the Ordinance. At the Commissioners’ level, a development bias in favor of industrial use was found. The methodology may also be used to forecast the probability that a particular rezoning request will be approved.
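The general shape of such a logit analysis can be sketched as follows. This is a self-contained illustration on simulated data, not the study's model: the covariates (parcel acreage, a staff-recommendation indicator, an industrial-use indicator) are hypothetical stand-ins, and the sample is enlarged for numerical stability (the study itself analyzed 59 requests). The model is fitted by Newton–Raphson maximum likelihood and then used to forecast an approval probability.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
# Hypothetical covariates: intercept, parcel acreage, staff recommendation
# (0/1), and an indicator for requested industrial use.
X = np.column_stack([
    np.ones(n),
    rng.lognormal(2.0, 0.5, n),          # acreage
    rng.integers(0, 2, n),               # staff recommends approval
    rng.integers(0, 2, n),               # industrial use requested
]).astype(float)
true_b = np.array([-0.5, 0.02, 1.0, 0.5])
y = rng.binomial(1, 1 / (1 + np.exp(-X @ true_b)))  # approved (1) or not (0)

def fit_logit(X, y, iters=25):
    """Newton-Raphson maximum likelihood for the logit model."""
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = 1 / (1 + np.exp(-X @ b))    # fitted approval probabilities
        W = mu * (1 - mu)                # IRLS weights
        H = X.T @ (W[:, None] * X)       # Fisher information (neg. Hessian)
        g = X.T @ (y - mu)               # score (gradient of log-likelihood)
        b = b + np.linalg.solve(H, g)    # Newton step
    return b

b_hat = fit_logit(X, y)
# Forecast the approval probability for a new 10-acre request with a
# favorable staff recommendation and non-industrial use.
p_new = 1 / (1 + np.exp(-np.array([1.0, 10.0, 1.0, 0.0]) @ b_hat))
```

With the real data, each covariate would correspond to a criterion in the County Zoning Ordinance, so the fitted coefficients indicate which criteria actually drive approval decisions.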