This manuscript reports structure–function studies of catechol 2,3-dioxygenase (C23O64), the second enzyme in the metabolic pathway for degradation of 3-nitrotoluene by Diaphorobacter sp. strain DS2. The recombinant protein is a ring-cleavage enzyme for the 3-methylcatechol and 4-methylcatechol products formed after dioxygenation of the aromatic ring. Here we report the substrate-free, substrate-bound, and substrate-analog-bound crystal structures of C23O64. The protein crystallizes in the P6₂22 space group. The structures were determined by molecular replacement and refined to resolutions of 2.4, 2.4, and 2.2 Å, respectively. A comparison of the structures with related extradiol dioxygenases identified 22 conserved residues. A comparison of the active-site pocket with those of catechol 2,3-dioxygenase (LapB) from Pseudomonas sp. KL28 and homoprotocatechuate 2,3-dioxygenase (HPCD) from Brevibacterium fuscum shows similarities significant enough to suggest that the mechanism of enzyme action is similar to that of HPCD.
Industry 4.0 represents high-level methodologies for the development of new-generation manufacturing metrology systems, which are more intelligent (smart), autonomous, flexible, highly productive, and self-adaptable. One of the systems capable of responding to these challenges is a cyber-physical manufacturing metrology system (CP2MS) with techniques of artificial intelligence (AI). In general, a CP2MS generates big data, horizontally through the integration of coordinate measuring machines (CMMs) and vertically through control. This paper presents a cyber-physical manufacturing metrology model (CP3M) for Industry 4.0 developed by applying AI techniques such as engineering ontology (EO), ant-colony optimization (ACO), and genetic algorithms (GAs). In particular, the CP3M presents an intelligent approach to probe-configuration and setup planning for the inspection of prismatic measurement parts (PMPs) on a CMM. The set of possible PMP setups and probe configurations is reduced to an optimal number using the developed GA-based methodology. The major novelty is a new CP3M capable of responding to the requirements of the Industry 4.0 concept, such as intelligent, autonomous, and productive measuring systems. The model thus responds to one smart metrology requirement within the framework of Industry 4.0: it determines the optimal number of PMP setups and, for each setup, defines the probe configurations. Its main contribution is an increase in the productivity of the measuring process through a reduction of the total measurement time, as well as the elimination of human-factor errors through intelligent planning of probe configuration and part setup. The experiment was successfully performed using a PMP specially designed and manufactured for the purpose.
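The GA-based reduction of candidate setups can be pictured as a small set-cover search. The following is a minimal sketch under invented data — the setups, feature coverage, fitness weights, and GA parameters are illustrative, not taken from the paper:

```python
import random

# Illustrative stand-in for GA-based setup reduction (data invented):
# SETUP_COVERS[i] = set of part features measurable in candidate setup i.
SETUP_COVERS = [{0, 1}, {1, 2}, {2, 3, 4}, {0, 4}]
N_FEATURES = 5

def fitness(bits):
    """Lower is better: heavy penalty per uncovered feature, then setup count."""
    covered = set()
    for i, chosen in enumerate(bits):
        if chosen:
            covered |= SETUP_COVERS[i]
    return 10 * (N_FEATURES - len(covered)) + sum(bits)

def reduce_setups(pop_size=20, generations=40, rng=None):
    """Tiny GA: truncation selection, one-point crossover, bit-flip mutation."""
    rng = rng or random.Random(0)
    pop = [[rng.randint(0, 1) for _ in SETUP_COVERS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]  # keep the better half (elitism)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(a))   # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:           # occasional bit-flip mutation
                j = rng.randrange(len(child))
                child[j] ^= 1
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = reduce_setups()
print(best)  # a selection of setups covering all five features
```

The penalty term dominates the setup count, so the GA first drives toward full feature coverage and then trims redundant setups.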
The potential relationship between service demands and remanufacturing services (RMS) is essential for deciding on an RMS plan accurately and for improving efficiency and benefit. In traditional association rule mining methods, a large number of candidate sets reduces mining efficiency, and the results are not easy for customers to understand. Therefore, a mining method based on a binary particle swarm optimization ant colony algorithm is proposed to discover association rules between service demands and remanufacturing services. The method preprocesses the RMS records, converts them into a binary matrix, and uses the improved ant colony algorithm to mine the maximal frequent itemsets. Because the particle swarm algorithm determines the initial pheromone concentration of the ant colony, it avoids blind search by the ant colony, effectively enhances the search capability of the algorithm, and makes association rule mining faster and more accurate. Finally, a set of historical RMS records for a straightening machine is used to test the validity and feasibility of the method by extracting valid association rules to guide the design of an RMS scheme for straightening-machine parts.
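The binary-matrix encoding of service records, and the support/confidence computations that any frequent-itemset miner builds on, can be sketched as follows. The records and service items are invented, and brute-force enumeration stands in for the paper's ACO/PSO hybrid search:

```python
from itertools import combinations

# Invented remanufacturing-service records: one row per service order,
# one column per service item (the binary-matrix encoding).
ITEMS = ["clean", "weld", "grind", "coat"]
RECORDS = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 1, 1, 0],
]

def support(cols):
    """Fraction of records containing every item in `cols`."""
    return sum(all(rec[c] for c in cols) for rec in RECORDS) / len(RECORDS)

def frequent_itemsets(min_sup):
    """Brute-force enumeration; a stand-in for the ACO/PSO search."""
    return [cols
            for k in range(1, len(ITEMS) + 1)
            for cols in combinations(range(len(ITEMS)), k)
            if support(cols) >= min_sup]

def confidence(antecedent, consequent):
    """Confidence of the rule antecedent -> consequent."""
    return support(antecedent + consequent) / support(antecedent)

print(frequent_itemsets(0.5))
print(confidence((0,), (1,)))  # P(weld in order | clean in order) = 1.0
```

Rules are then read off the frequent itemsets by splitting each into antecedent and consequent and keeping those above a confidence threshold.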
Any modality in homotopy type theory gives rise to an orthogonal factorization system of which the left class is stable under pullbacks. We show that there is a second orthogonal factorization system associated with any modality, of which the left class is the class of ○-equivalences and the right class is the class of ○-étale maps. This factorization system is called the modal reflective factorization system of a modality, and we give a precise characterization of the orthogonal factorization systems that arise as the modal reflective factorization system of a modality. In the special case of the n-truncation, the modal reflective factorization system has a simple description: we show that the n-étale maps are the maps that are right orthogonal to the map $1 \to S^{n+1}$. We use the ○-étale maps to prove a modal descent theorem: a map with modal fibers into ○X is the same thing as a ○-étale map into a type X. We conclude with an application to real-cohesive homotopy type theory and remark on how ○-étale maps relate to the formally étale maps from algebraic geometry.
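The right-orthogonality condition used here is the standard unique-lifting property; as a sketch (standard definitions, not notation from this paper):

```latex
% A map f : A -> B is right orthogonal to g : X -> Y when, for every
% commuting square
%
%     X ----> A
%     |       |
%   g |       | f
%     v       v
%     Y ----> B
%
% the type of diagonal fillers j : Y -> A making both triangles commute
% is contractible. Instantiating g := (1 -> S^{n+1}) recovers the
% characterization of the n-etale maps above.
\[
  \mathrm{isContr}\Bigl(\textstyle\sum_{j : Y \to A}
    \bigl(j \circ g = \mathrm{top}\bigr) \times
    \bigl(f \circ j = \mathrm{bot}\bigr)\Bigr)
\]
```

The precise HoTT formulation of the filler type also involves a coherence between the two identifications; this sketch elides it.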
A k-permutation family on n vertices is a set-system consisting of the intervals of k permutations of the integers 1 to n. The discrepancy of a set-system is the minimum over all red–blue vertex colourings of the maximum difference between the number of red and blue vertices in any set in the system. In 2011, Newman and Nikolov disproved a conjecture of Beck that the discrepancy of any 3-permutation family is at most a constant independent of n. Here we give a simpler proof that Newman and Nikolov’s sequence of 3-permutation families has discrepancy $\Omega (\log \,n)$. We also exhibit a sequence of 6-permutation families with root-mean-squared discrepancy $\Omega (\sqrt {\log \,n} )$; that is, in any red–blue vertex colouring, the square root of the expected squared difference between the number of red and blue vertices in an interval of the system is $\Omega (\sqrt {\log \,n} )$.
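For very small n, the discrepancy of an interval set-system can be computed directly by exhausting all ±1 colourings. A sketch, where "intervals" means contiguous blocks of each permutation:

```python
from itertools import product

def interval_sets(perms):
    """All intervals (contiguous blocks) of each permutation, as vertex sets."""
    sets = set()
    for p in perms:
        for i in range(len(p)):
            for j in range(i + 1, len(p) + 1):
                sets.add(frozenset(p[i:j]))
    return sets

def discrepancy(perms):
    """Min over all +/-1 vertex colourings of the max |colour sum| over sets.
    Brute force over 2^n colourings: only feasible for very small n."""
    n = len(perms[0])
    sets = interval_sets(perms)
    return min(max(abs(sum(colours[v] for v in s)) for s in sets)
               for colours in product((-1, 1), repeat=n))

# A single permutation has discrepancy 1: alternate the two colours.
print(discrepancy([(0, 1, 2, 3)]))  # 1
```

The lower bound from the paper concerns the growth of this quantity as n increases, which is far beyond what this brute-force check can reach.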
Understanding individual differences in attitudes to autism is crucial for improving attitudes and reducing stigma towards autistic people, yet there is limited and inconsistent research on this topic. This is compounded by a lack of appropriate measures and multivariate analyses. Addressing these issues, using up-to-date measures and multiple linear regression, we examined the relative contributions of participant age, sex, autism knowledge, level of contact with autistic people, and autistic traits to attitudes towards autistic people. We found that greater autism knowledge and higher levels of contact, but no other variables, were uniquely predictive of attitudes towards autistic people. We conclude that, in addition to public awareness campaigns to raise knowledge of autism, it may be important to increase contact between autistic and non-autistic people to improve public attitudes towards autistic people.
Organo-modified clay nanoparticles were mixed at 1 and 5 wt% concentrations with a molten blend of 75 wt% polylactide (PLA) and 25 wt% poly[(butylene adipate)-co-terephthalate] (PBAT). Three mixing strategies were used to control the localization of the nanoclay. Small-amplitude oscillatory shear (SAOS) and stress growth tests were conducted to clarify the nanoclay's interactions with the blend components and its effect on the molecular relaxation behavior. SAOS properties and weighted relaxation spectra were determined before and after pre-shearing at a rate of 0.01 s⁻¹. Molecular relaxation and its characteristics were influenced by PLA degradation, PBAT droplet coalescence, and nanoclay localization.
The massive volume of data generated in modern applications can overwhelm our ability to conveniently transmit, store, and index it. For many scenarios, building a compact summary of a dataset that is vastly smaller enables flexibility and efficiency in a range of queries over the data, in exchange for some approximation. This comprehensive introduction to data summarization, aimed at practitioners and students, showcases the algorithms, their behavior, and the mathematical underpinnings of their operation. The coverage starts with simple sums and approximate counts, building to more advanced probabilistic structures such as the Bloom Filter, distinct value summaries, sketches, and quantile summaries. Summaries are described for specific types of data, such as geometric data, graphs, and vectors and matrices. The authors offer detailed descriptions of and pseudocode for key algorithms that have been incorporated in systems from companies such as Google, Apple, Microsoft, Netflix and Twitter.
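As one example of the probabilistic structures mentioned above, a minimal Bloom filter fits in a few lines. This sketch uses salted SHA-256 digests to simulate k independent hash functions (the parameters m and k are arbitrary illustrative choices):

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: m bit positions, k salted SHA-256 hashes.
    Membership tests can yield false positives but never false negatives."""

    def __init__(self, m=1024, k=4):
        self.m, self.k = m, k
        self.bits = bytearray(m)  # one byte per bit, for clarity

    def _positions(self, item):
        for salt in range(self.k):
            digest = hashlib.sha256(f"{salt}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = 1

    def __contains__(self, item):
        return all(self.bits[pos] for pos in self._positions(item))

bf = BloomFilter()
bf.add("alice")
print("alice" in bf)  # True -- no false negatives
print("bob" in bf)    # almost certainly False at this tiny load factor
```

The space saving is the point: a few kilobits summarize a set whose exact representation could be arbitrarily large, at the price of a tunable false-positive rate.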
This paper explores the analysis of ability, where ability is to be understood in the epistemic sense—in contrast to what might be called a causal sense. There are plenty of cases where an agent is able to perform an action that guarantees a given result even though she does not know which of her actions guarantees that result. Such an agent possesses the causal ability but lacks the epistemic ability. The standard analysis of such epistemic abilities relies on the notion of action types—as opposed to action tokens—and then posits that an agent has the epistemic ability to do something if and only if there is an action type available to her that she knows guarantees it. We show that these action types are not needed: we present a formalism without action types that can simulate analyses of epistemic ability that rely on action types. Our formalism is a standard epistemic extension of the theory of “seeing to it that”, which arose from a modal tradition in the logic of action.
Population analyses of functional connectivity have provided a rich understanding of how brain function differs across time, individual, and cognitive task. An important but challenging task in such population analyses is the identification of reliable features that describe the function of the brain, while accounting for individual heterogeneity. Our work is motivated by two particularly important challenges in this area: first, how can one analyze functional connectivity data over populations of individuals, and second, how can one use these analyses to infer group similarities and differences. Motivated by these challenges, we model population connectivity data as a multilayer network and develop the multi-node2vec algorithm, an efficient and scalable embedding method that automatically learns continuous node feature representations from multilayer networks. We use multi-node2vec to analyze resting state fMRI scans over a group of 74 healthy individuals and 60 patients with schizophrenia. We demonstrate how multilayer network embeddings can be used to visualize, cluster, and classify functional regions of the brain for these individuals. We furthermore compare the multilayer network embeddings of the two groups. We identify significant differences between the groups in the default mode network and salience network—findings that are supported by the triple network model theory of cognitive organization. Our findings reveal that multi-node2vec is a powerful and reliable method for analyzing multilayer networks. Data and publicly available code are available at https://github.com/jdwilson4/multi-node2vec.
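The neighbourhood sampling underlying node2vec-style methods extends to multilayer networks by letting random walks hop between layers. This toy sketch (invented three-node network, arbitrary switch probability) illustrates the idea only and is not the multi-node2vec algorithm itself; see the linked repository for that:

```python
import random

# Toy multilayer network over one node set: e.g., one layer per individual's
# functional-connectivity graph (adjacency lists; data invented).
LAYERS = [
    {"A": ["B", "C"], "B": ["A"], "C": ["A"]},
    {"A": ["B"], "B": ["A", "C"], "C": ["B"]},
]

def multilayer_walk(start, length, p_switch=0.5, rng=None):
    """Random walk that may hop between layers at each step -- the kind of
    neighbourhood sample a multilayer embedding feeds to a word2vec model."""
    rng = rng or random.Random(0)
    layer = rng.randrange(len(LAYERS))
    node, walk = start, [start]
    for _ in range(length):
        if rng.random() < p_switch:          # possibly jump to another layer
            layer = rng.randrange(len(LAYERS))
        node = rng.choice(LAYERS[layer][node])  # step within the current layer
        walk.append(node)
    return walk

walk = multilayer_walk("A", 10)
print(walk)  # a length-11 node sequence over {"A", "B", "C"}
```

Collections of such walks serve as "sentences" for a skip-gram model, which then produces one continuous feature vector per node across all layers.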
Supra-Bayesianism is the Bayesian response to learning the opinions of others. Probability pooling constitutes an alternative response. One natural question is whether there are cases where probability pooling gives the supra-Bayesian result. This has been called the problem of Bayes-compatibility for pooling functions. It is known that in a common prior setting, under standard assumptions, linear pooling cannot be nontrivially Bayes-compatible. We show by contrast that geometric pooling can be nontrivially Bayes-compatible. Indeed, we show that, under certain assumptions, geometric and Bayes-compatible pooling are equivalent. Granting supra-Bayesianism its usual normative status, one upshot of our study is thus that, in a certain class of epistemic contexts, geometric pooling enjoys a normative advantage over linear pooling as a social learning mechanism. We discuss the philosophical ramifications of this advantage, which we show to be robust to variations in our statement of the Bayes-compatibility problem.
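The two pooling rules compared here have simple closed forms: the linear pool is a weighted arithmetic mean of the credence functions, and the geometric pool is a renormalized weighted geometric mean. A minimal sketch over a finite hypothesis space, with invented credences:

```python
import math

def linear_pool(dists, weights):
    """Weighted arithmetic mean of credence functions (pointwise)."""
    return [sum(w * d[i] for d, w in zip(dists, weights))
            for i in range(len(dists[0]))]

def geometric_pool(dists, weights):
    """Renormalized weighted geometric mean of credence functions."""
    raw = [math.prod(d[i] ** w for d, w in zip(dists, weights))
           for i in range(len(dists[0]))]
    total = sum(raw)
    return [r / total for r in raw]

p = [0.8, 0.2]  # agent 1's credences over two hypotheses (invented)
q = [0.5, 0.5]  # agent 2's credences
print(linear_pool([p, q], [0.5, 0.5]))     # arithmetic mean: ~[0.65, 0.35]
print(geometric_pool([p, q], [0.5, 0.5]))  # renormalized: ~[2/3, 1/3]
```

Note that the geometric pool requires renormalization precisely because a pointwise geometric mean of probability functions need not sum to one; this renormalization step is what gives the rule its Bayes-like multiplicative form.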
This paper explores relational syllogistic logics, a family of logical systems related to reasoning about relations in extensions of the classical syllogistic. These are all decidable logical systems. We prove completeness theorems and complexity results for a natural subfamily of relational syllogistic logics, parametrized by constructors for terms and for sentences.
In this article, I provide Urquhart-style semilattice semantics for three connexive logics in an implication-negation language (I call these “pure theories of connexive implication”). The systems semantically characterized include the implication-negation fragment of a connexive logic of Wansing, a relevant connexive logic recently developed proof-theoretically by Francez, and an intermediate system that is novel to this article. Simple proofs of soundness and completeness are given and the semantics is used to establish various facts about the systems (e.g., that two of the systems have the variable sharing property). I emphasize the intuitive content of the semantics and discuss how natural informational considerations underlie each of the examined systems.
A vexing question in Bayesian epistemology is how an agent should update on evidence which she assigned zero prior credence. Some theorists have suggested that, in such cases, the agent should update by Kolmogorov conditionalization, a norm based on Kolmogorov’s theory of regular conditional distributions. However, it turns out that in some situations, a Kolmogorov conditionalizer will plan to always assign a posterior credence of zero to the evidence she learns. Intuitively, such a plan is irrational and easily Dutch bookable. In this paper, we propose a revised norm, Kolmogorov–Blackwell conditionalization, which avoids this problem. We prove a Dutch book theorem and converse Dutch book theorem for this revised norm, and relate our results to those of Rescorla (2018).
In his Tractatus, Wittgenstein maintained that arithmetic consists of equations arrived at by the practice of calculating outcomes of operations $\Omega ^{n}(\bar {\xi })$ defined with the help of numeral exponents. Since $Num(x)$ and quantification over numbers seem ill-formed, Ramsey wrote that the approach is faced with “insuperable difficulties.” This paper takes Wittgenstein to have assumed that his audience would have an understanding of the implicit general rules governing his operations. By employing the Tractarian logicist interpretation that the N-operator $N(\bar {\xi })$ and recursively defined arithmetic operators $\Omega ^{n}(\bar {\xi })$ are not different in kind, we can address Ramsey’s problem. Moreover, we can take important steps toward better understanding how Wittgenstein might have imagined emulating proof by mathematical induction.
This paper puts forward a new account of rigorous mathematical proof and its epistemology. One novel feature is a focus on how the skill of reading and writing valid proofs is learnt, as a way of understanding what validity itself amounts to. The account is used to address two current questions in the literature: that of how mathematicians are so good at resolving disputes about validity, and that of whether rigorous proofs are necessarily formalizable.