Equality of parameter estimates across subgroups is a substantial requirement of statistical tests. Ignoring subgroup differences poses a threat to study replicability, model specification, and theory development. Structural change tests are a powerful statistical technique to assess parameter invariance. A core element of those tests is the empirical fluctuation process. In the case of parameter invariance, the fluctuation process asymptotically follows a Brownian bridge. This asymptotic assumption further provides the basis for inference. However, the empirical fluctuation process does not follow a Brownian bridge in small samples, and this situation is amplified in large psychometric models. Therefore, common methods of obtaining the sampling distribution are invalid and the structural change test becomes conservative. We discuss an alternative way of obtaining the sampling distribution: permutation approaches. Permutation approaches estimate the sampling distribution through resampling of the dataset, avoiding distributional assumptions. This improves the tests' power. We conclude that the permutation alternative is superior to standard asymptotic approximations of the sampling distribution.
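The resampling idea is generic. The following is a minimal sketch of the permutation principle, using a simple difference-in-means statistic as a stand-in for the fluctuation-based statistic; the data, subgroup sizes, and number of permutations are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two subgroups whose means differ (a simulated parameter change).
x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(0.6, 1.0, 200)])
g = np.repeat([0, 1], 200)

def stat(x, g):
    # Stand-in test statistic: absolute difference of subgroup means.
    return abs(x[g == 1].mean() - x[g == 0].mean())

observed = stat(x, g)

# Permutation null: shuffle subgroup labels and re-evaluate the statistic;
# no distributional assumption about the statistic is required.
perm = np.array([stat(x, rng.permutation(g)) for _ in range(2000)])
p_value = (1 + np.sum(perm >= observed)) / (1 + len(perm))
print(round(p_value, 4))
```

Because the null distribution is built from the data themselves, the same loop applies unchanged when the asymptotic Brownian-bridge approximation is unreliable.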
For many years, historical accounts of Australian Federation ignored the distinctive ideological origins of the Australian Constitution. From the mid 1980s until the 2000s, however, a generation of historians remembered how the Australian drafters built a distinctive constitutional democracy that combined trust in parliament with a direct constitutional role for a plural ‘people’: the people of Australia and the people of the states. Drawing on Chartist and American ideas of popular sovereignty, this system of popular political constitutionalism textually guarantees that ‘the people’ can ‘directly’ choose both houses of parliament, break deadlocks between these houses and make constitutional law. The definition of ‘the people’ in this distinctive form of constitutional democracy was, however, racially exclusive. In particular, First Australians were excluded from the plural people of Australia.
This intellectual history of the Australian Constitution, however, has had remarkably little impact on constitutional interpretation and discourse. This paper will begin the process of examining those implications. First, it will show how this history provides important contextual support and direction to the implied limitations on parliamentary power that stem from the constitution’s guarantee of representative democracy in sections 7 and 24 of the Constitution. Second, it will demonstrate how it aids in better understanding Australia’s unique constitutional system. To date, this system has remedied the racist roots of the original constitutional definition of the people largely through legislative reform. The constitutional recognition of First Australians is a critical step in acknowledging that First Australians are a distinct part of the plural Australian people. In the aftermath of the failure of the First Nations Voice to Parliament proposal, meaningful constitutional recognition for First Australians must address their structural exclusion from the plural Australian people.
Experts argue that resource transfers from developed to developing countries are central to international climate policy efforts. Yet as countries grapple with the political difficulties of provisioning and accepting climate funds, understanding why voters support or oppose international climate finance becomes critical. Focusing on domestic audiences in both donor and recipient countries, we investigate the determinants of public support for cross-border climate transfers. Theoretically, we focus on the effects of emphasizing the compensatory purposes of funding, favoring mitigation over adaptation activities, and prioritizing partnerships between donor and recipient agents—three factors that generate both normative and material benefits, and thus build support among broader coalitions of voters. Paired survey experiments in the United States and India corroborate the relevance of these transfer features for citizens in donor and recipient countries. Taken together, our findings shed light on the domestic political-economy attributes of transfer agreements that can unlock support for cross-border climate cooperation.
DNA unzipping by nanopore translocation has implications in diverse contexts, from polymer physics to single-molecule manipulation to DNA–enzyme interactions in biological systems. Here we use molecular dynamics simulations and a coarse-grained model of DNA to address the nanopore unzipping of DNA filaments that are knotted. This previously unaddressed problem is motivated by the fact that DNA knots inevitably occur in isolated equilibrated filaments and in vivo. We study how different types of tight knots in the DNA segment just outside the pore impact unzipping at different driving forces. We establish three main results. First, knots do not significantly affect the unzipping process at low forces. Second, knotted DNAs unzip more slowly and heterogeneously than unknotted ones at high forces. Finally, we observe that the microscopic origin of the hindrance typically involves two concurrent causes: the topological friction of the DNA chain sliding along its knotted contour and the additional friction originating from the entanglement with the newly unzipped DNA. The results reveal a previously unsuspected complexity of the interplay of DNA topology and unzipping, which should be relevant for interpreting nanopore-based single-molecule unzipping experiments and improving the modeling of DNA transactions in vivo.
Some economic interactions are based on trust, others on monetary incentives or monitoring. In the tax compliance context, the monitoring approach creates compliance based on audits and fines (enforced compliance), in contrast to the trust-based (voluntary compliance) approach, which is based on taxpayers’ willingness to comply. Here, we examine how changes in taxation regarding platform economy revenues affect intended labor supply on such platforms. New EU legislation, effective from 2023, will mandate data sharing between platforms and tax authorities across Europe, thus resulting in increased monitoring. We investigate how this upcoming shift in monitoring power affects the intended use of platforms and how it may interact with users’ trust. We use a survey among platform workers (N = 626) in the Netherlands to examine views of the proposed regulation change, corrected for the proportion of platform income and several measures of trust. We experimentally manipulate information by either informing participants about the upcoming monitoring change or not. Results show that informing respondents about the change negatively affects expected supply of labor, and this effect is independent of respondents’ trust. We discuss the policy implications of these results.
Human mitochondrial Complex I is one of the largest multi-subunit membrane protein megacomplexes, which plays a critical role in oxidative phosphorylation and ATP production. It is also involved in many neurodegenerative diseases. However, studying its structure and the mechanisms underlying proton translocation remains challenging due to the hydrophobic nature of its transmembrane parts. In this structural bioinformatic study, we use the QTY code to reduce the hydrophobicity of megacomplex I, while preserving its structure and function. We carry out the structural bioinformatics analysis of 20 key enzymes in the integral membrane parts. We compare their native structures, experimentally determined using cryo-electron microscopy (cryo-EM), with their water-soluble QTY analogs predicted using AlphaFold 3. Leveraging AlphaFold 3’s advanced capabilities in predicting protein–protein complex interactions, we further explore whether the QTY-coded integral membrane proteins maintain the protein–protein interactions necessary to form the functional megacomplex. Our structural bioinformatics analysis not only demonstrates the feasibility of engineering water-soluble integral membrane proteins using the QTY code, but also highlights the potential to use the water-soluble membrane protein QTY analogs as soluble antigens for discovery of therapeutic monoclonal antibodies, thus offering promising implications for the treatment of various neurodegenerative diseases.
Foreign interference is a growing threat to all liberal democracies, including Australia. To respond to this growing threat, the Department of Home Affairs has developed a complex ‘Counter-Foreign Interference Strategy’ (CFIS). At the heart of the strategy lies a suite of interlocking and overlapping legislation, including the Foreign Influence Transparency Scheme Act 2018 (FITS Act), the National Security Legislation Amendment (Espionage and Foreign Interference) Act 2018 and the Electoral Legislation Amendment (Electoral Funding and Disclosure Reform) Act 2018 (Electoral Funding Act). The aim of this paper is to explain and clarify the legislation and the free speech burdens it imposes, and determine whether the laws are suitably targeted at foreign interference without unduly limiting legitimate communication activity. We argue that the current criminal law regime is ineffective in addressing the problem because foreign interference is a complex and pervasive phenomenon taking many different forms — from espionage on university campuses to anonymous and targeted social media campaigns. The legislative scheme is not properly tailored to tackle foreign interference as it actually occurs.
Given a squared Euclidean norm penalty, we examine some less well-known properties of shrinkage estimates. In particular, we highlight that it is possible for some components of the shrinkage estimator to be placed further away from the prior mean than the original estimate. An analysis of this effect is provided within three different modeling settings—encompassing linear, logistic, and ordinal regression models. Additional simulations show that the outlined effect is not a mathematical artefact, but likely to occur in practice. As a byproduct, they also highlight the possibilities of sign reversals (“overshoots”) for shrinkage estimates. We point out practical consequences and challenges, which might arise from the observed effects with special emphasis on psychometrics.
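The component-wise "anti-shrinkage" effect can be reproduced in a toy ridge-regression setting. The following is a minimal sketch under assumed values: the Gram matrix, cross-products, and penalty are illustrative, the prior mean is zero, and the numbers are chosen so the unpenalized solution has a zero second component:

```python
import numpy as np

# Assumed two-predictor setup: correlated Gram matrix X'X and cross-product
# X'y chosen so that the unpenalized (OLS) solution is exactly [1, 0].
XtX = np.array([[1.0, 0.9],
                [0.9, 1.0]])
Xty = np.array([1.0, 0.9])

beta_ols = np.linalg.solve(XtX, Xty)                      # [1.0, 0.0]
lam = 0.1                                                 # ridge penalty
beta_ridge = np.linalg.solve(XtX + lam * np.eye(2), Xty)  # [0.725, 0.225]

# The overall estimate shrinks toward the prior mean (zero), yet the
# second component moves *away* from the prior mean under the penalty.
print(beta_ols, beta_ridge)
```

With correlated predictors, the penalty redistributes weight between components, so a single coordinate can end up farther from the prior mean even though the norm of the whole vector shrinks.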
This paper investigates the precision of parameters estimated from local samples of time-dependent functions. We find that time delay embedding, i.e., structuring data prior to analysis by constructing a data matrix of overlapping samples, increases the precision of parameter estimates and in turn statistical power compared to standard independent rows of panel data. We show that the reason for this effect is that the sign of estimation bias depends on the position of a misplaced data point if there is no a priori knowledge about initial conditions of the time-dependent function. Hence, we reason that the advantage of time delay embedding is likely to hold true for a wide variety of functions. We support these conclusions with both mathematical analysis and two simulations.
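The embedding step described above, turning a single series into a data matrix of overlapping samples, can be sketched as follows; the `delay_embed` helper, embedding dimension, and toy series are illustrative:

```python
import numpy as np

def delay_embed(x, dim, lag=1):
    # Rows are overlapping windows of the series: row i holds
    # x[i], x[i+lag], ..., x[i+(dim-1)*lag].
    n = len(x) - (dim - 1) * lag
    return np.stack([x[i : i + n] for i in range(0, dim * lag, lag)], axis=1)

x = np.arange(6.0)          # toy series: 0, 1, ..., 5
E = delay_embed(x, dim=3)
print(E)                    # rows [0,1,2], [1,2,3], [2,3,4], [3,4,5]
```

Each row of the resulting matrix shares observations with its neighbors, which is what distinguishes this layout from independent rows of panel data.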
The Wald, likelihood ratio, score, and the recently proposed gradient statistics can be used to assess a broad range of hypotheses in item response theory models, for instance, to check the overall model fit or to detect differential item functioning. We introduce new methods for power analysis and sample size planning that can be applied when marginal maximum likelihood estimation is used. This allows the application to a variety of IRT models, which are commonly used in practice, e.g., in large-scale educational assessments. An analytical method utilizes the asymptotic distributions of the statistics under alternative hypotheses. We also provide a sampling-based approach for applications where the analytical approach is computationally infeasible. This can be the case with 20 or more items, since the computational load increases exponentially with the number of items. We performed extensive simulation studies in three practically relevant settings, i.e., testing a Rasch model against a 2PL model, testing for differential item functioning, and testing a partial credit model against a generalized partial credit model. The observed distributions of the test statistics and the power of the tests agreed well with the predictions by the proposed methods in sufficiently large samples. We provide an openly accessible R package that implements the methods for user-supplied hypotheses.
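The analytical approach rests on the asymptotic noncentral chi-square distribution of the test statistics under the alternative. A minimal sketch for a hypothetical one-degree-of-freedom test follows; the per-observation noncentrality is an assumed planning value, and Monte Carlo simulation stands in for a closed-form noncentral chi-square tail probability to keep the example self-contained:

```python
import numpy as np

rng = np.random.default_rng(1)

crit = 3.841    # chi-square(1) critical value at alpha = 0.05
effect = 0.02   # assumed noncentrality contributed by a single observation

# Under the alternative, a 1-df statistic is asymptotically noncentral
# chi-square; for df = 1 this equals (Z + sqrt(ncp))**2 with Z standard
# normal, so power can be approximated without special functions.
powers = []
for n in (100, 300, 500):
    draws = (rng.normal(size=100_000) + np.sqrt(n * effect)) ** 2
    powers.append((draws > crit).mean())
print([round(p, 3) for p in powers])
```

Inverting this relationship, i.e., searching for the smallest n whose power reaches a target such as 0.80, is the corresponding sample-size-planning step.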
Hierarchical models are often considered to measure latent concepts defining nested sets of manifest variables. Therefore, by supposing a hierarchical relationship among manifest variables, the general latent concept can be represented by a tree structure where each internal node represents a specific order of abstraction for the latent concept measured. In this paper, we propose a new latent factor model called second-order disjoint factor analysis in order to model an unknown two-order hierarchical structure of the manifest variables. This is a second-order factor analysis which, in contrast to second-order confirmatory factor analysis, is exploratory, nested, and estimated simultaneously by the maximum likelihood method. Each subset of manifest variables is modeled to be internally consistent and reliable, that is, manifest variables related to a factor consistently measure a unique theoretical construct. This feature implies that manifest variables are positively correlated with the related factor and, therefore, the associated factor loadings are constrained to be nonnegative. A cyclic block coordinate descent algorithm is proposed to maximize the likelihood. We present a simulation study that investigates the ability to recover reliable factors. Furthermore, the new model is applied to identify the underlying factors of well-being, showing the characteristics of the new methodology. A final discussion completes the paper.
Human abilities in perceptual domains have conventionally been described with reference to a threshold that may be defined as the maximum amount of stimulation which leads to baseline performance. Traditional psychometric links, such as the probit, logit, and t, are incompatible with a threshold as there are no true scores corresponding to baseline performance. We introduce a truncated probit link for modeling thresholds and develop a two-parameter IRT model based on this link. The model is Bayesian and analysis is performed with MCMC sampling. Through simulation, we show that the model provides for accurate measurement of performance with thresholds. The model is applied to a digit-classification experiment in which digits are briefly flashed and then subsequently masked. Using parameter estimates from the model, individuals’ thresholds for flashed-digit discrimination are estimated.
Egg masses of Aplysia depilans consist of long and intertwined strings containing numerous capsules with eggs. Light microscopy stains and transmission electron microscopy revealed four layers in the gelatinous sheath that encircled and aggregated the chain of egg capsules. The outermost layer had a fluffy structure. The second, third, and fourth layers consisted of reticulated matrices with different densities. The second and third layers were divided into 5‒6 strata each. The fourth and innermost layer of the gelatinous sheath had a higher density and no visible stratification. This layer glued the tightly packed capsules to one another and to the outer layers of the gelatinous sheath. The thin wall of the capsules was formed by a homogeneous and highly electron-dense material. Inside the capsules, the eggs or embryos were bathed in an electron-lucent aqueous medium. Bacteria and diatoms were the most abundant microorganisms on the surface of egg strings. Bacteria penetrated the gelatinous sheath and appeared to be involved in the degradation of the upper strata, but were never found inside the egg capsules. Metagenomic analysis revealed a large taxonomic diversity of bacteria associated with egg masses of A. depilans. Although 15 phyla could be recognized, the families Flavobacteriaceae (Bacteroidota), Lentisphaeraceae (Lentisphaerota), and Rhodobacteraceae (Pseudomonadota) represented 67.9% ± 11.6% of the relative abundance in the microbiome of the egg string samples. The presence of genera capable of decomposing polysaccharides, such as Tenacibaculum and Cellulophaga, supports the idea that bacteria are responsible for the degradation of the gelatinous layers of the egg strings.
A Metropolis–Hastings Robbins–Monro (MH-RM) algorithm for high-dimensional maximum marginal likelihood exploratory item factor analysis is proposed. The sequence of estimates from the MH-RM algorithm converges with probability one to the maximum likelihood solution. Details on the computer implementation of this algorithm are provided. The accuracy of the proposed algorithm is demonstrated with simulations. As an illustration, the proposed algorithm is applied to explore the factor structure underlying a new quality of life scale for children. It is shown that when the dimensionality is high, MH-RM has advantages over existing methods such as the numerical quadrature-based EM algorithm. Extensions of the algorithm to other modeling frameworks are discussed.
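The Robbins–Monro component of such algorithms replaces exact expectations with noisy evaluations combined with a decaying gain sequence. The following one-dimensional sketch illustrates only that stochastic-approximation step, not the MH-RM update itself; the target function, noise, and gain sequence are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stochastic approximation: find theta solving E[g(theta, noise)] = 0,
# here with g(theta, e) = (theta - 2) + e, so the root is theta = 2.
theta = 0.0
for k in range(1, 5001):
    noisy = (theta - 2.0) + rng.normal()   # noisy evaluation of g
    theta -= noisy / k                     # Robbins-Monro gain 1/k
print(round(theta, 2))                     # converges near 2
```

The decaying 1/k gain averages out the noise over iterations, which is the mechanism that lets MH-RM use Metropolis–Hastings draws in place of intractable integrals while still converging with probability one.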
Mark Noll recognized that “the most comprehensive defense of female activity in public life came from Sarah Grimké.” Claudia Setzer lauded Grimké’s Letters, as “the first sustained analysis of women’s rights stemming from biblical and theological argument to be written by an American.” Scholars have studied her use of the Bible, including her critique of translations, but none has detailed Grimké’s use of the influential whole-volume commentaries of Matthew Henry, Thomas Scott, and Adam Clarke. This article documents her citations, critiques, and editing of those commentaries through selection, interruption, omission, and paraphrase. It focuses upon her thirteenth and fourteenth letters, in which Grimké interpreted Acts 2:1–4, 1 Cor 11:4–5 and 14:34–35, and 1 Tim 2:8–12. By studying her critical engagement with commentaries, we demonstrate the veracity of Grimké’s contention that women “shall produce some various readings of the Bible a little different from those we now have.”