Given a graph $F$, we consider the problem of determining the densest possible pseudorandom graph that contains no copy of $F$. We provide an embedding procedure that improves a general result of Conlon, Fox, and Zhao giving an upper bound on the density. In particular, our result implies that optimally pseudorandom graphs with density greater than $n^{-1/3}$ must contain a copy of the Petersen graph, while the previous best result gives the bound $n^{-1/4}$. Moreover, we conjecture that the exponent $1/3$ in our bound is tight. We also construct the densest known pseudorandom $K_{2,3}$-free graphs that are also triangle-free. Finally, we give a different proof that the densest known construction of clique-free pseudorandom graphs, due to Bishnoi, Ihringer, and Pepe, contains no large clique.
This paper discusses the challenges and opportunities in accessing data to improve workplace relations law enforcement, with reference to minimum employment standards such as wages and working hours regulation. Our paper highlights some innovative examples of government and trade union efforts to collect and use data to improve the detection of noncompliance. These examples reveal the potential of data science as a compliance tool but also suggest the importance of realizing a data ecosystem that is capable of being utilized by machine learning applications. The effectiveness of using data and data science tools to improve workplace law enforcement is impacted by the ability of regulatory actors to access useful data they do not collect or hold themselves. Under “open data” principles, government data is increasingly made available to the public so that it can be combined with nongovernment data to generate value. Through mapping and analysis of the Australian workplace relations data ecosystem, we show that data availability relevant to workplace law compliance falls well short of open data principles. However, we argue that with the right protocols in place, improved data collection and sharing will assist regulatory actors in the effective enforcement of workplace laws.
We investigate here the behaviour of a large typical meandric system, proving a central limit theorem for the number of components of a given shape. Our main tool is a theorem of Gao and Wormald that allows us to deduce a central limit theorem from the asymptotics of large moments of our quantities of interest.
When people are asked to recall their social networks, theoretical and empirical work tells us that they rely on shortcuts, or heuristics. Cognitive social structures (CSSs) are multilayer social networks where each layer corresponds to an individual’s perception of the network. With multiple perceptions of the same network, CSSs contain rich information about how these heuristics manifest, motivating the question, Can we identify people who share the same heuristics? In this work, we propose a method for identifying cognitive structure across multiple network perceptions, analogous to how community detection aims to identify social structure in a network. To simultaneously model the joint latent social and cognitive structure, we study CSSs as three-dimensional tensors, employing low-rank nonnegative Tucker decompositions (NNTuck) to approximate the CSS—a procedure closely related to estimating a multilayer stochastic block model (SBM) from such data. We propose the resulting latent cognitive space as an operationalization of the sociological theory of social cognition by identifying individuals who share relational schema. In addition to modeling cognitively independent, dependent, and redundant networks, we propose a specific model instance and related statistical test for testing when there is social-cognitive agreement in a network: when the social and cognitive structures are equivalent. We use our approach to analyze four different CSSs and give insights into the latent cognitive structures of those networks.
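The NNTuck procedure above rests on a nonnegative Tucker decomposition of the CSS tensor. As a minimal numpy sketch of the underlying tensor algebra, the unconstrained variant (truncated HOSVD) can be written as follows; note that NNTuck additionally imposes nonnegativity on the core and factors, which this sketch omits:

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: arrange the mode-`mode` fibers as rows of a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    """Truncated HOSVD: a Tucker decomposition T ~ core x_1 U1 x_2 U2 x_3 U3.
    (NNTuck instead constrains the factors to be nonnegative.)"""
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])
    core = T
    for mode, U in enumerate(factors):
        # Multiply along `mode` by U^T to project onto the factor subspace.
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

def reconstruct(core, factors):
    """Multiply the core back out along each mode to approximate T."""
    T = core
    for mode, U in enumerate(factors):
        T = np.moveaxis(np.tensordot(U, np.moveaxis(T, mode, 0), axes=1), 0, mode)
    return T
```

With full ranks the decomposition is exact; choosing smaller ranks yields the low-rank approximation that plays the role of the latent social and cognitive structure.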
In this paper, an online adaptive super-twisting sliding mode controller is proposed for a non-linear system. The adaptive controller is designed to deal with unknown dynamic uncertainties and deliver the best trajectory tracking. The adaptation is based on a Particle Swarm Optimization (PSO) algorithm that tunes the parameters online by minimizing the objective function. The novelty of this study lies in handling the parameter setting of the conventional super-twisting algorithm online, bypassing heavy offline calculation, and avoiding instability and abrupt changes of the controller’s parameters, which extends actuator lifetime. This approach has been applied to an upper-limb exoskeleton robot for arm rehabilitation. Although the dynamic model of the system differs from one patient to another due to the direct interaction between the wearer and the exoskeleton, the control technique preserves its robustness with respect to bounded external disturbances. The effectiveness of the proposed adaptive controller has been demonstrated in simulation and then in real-time experiments with two human subjects. A comparison between the proposed approach and the classic super-twisting algorithm has been conducted; the obtained results show the performance and efficiency of the proposed controller.
We establish here an integral inequality for real log-concave functions, which can be viewed as an average monotone likelihood property. This inequality is then applied to examine the monotonicity of failure rates.
The $d$-process generates a graph at random by starting with an empty graph with $n$ vertices, then adding edges one at a time uniformly at random among all pairs of vertices which have degrees at most $d-1$ and are not mutually joined. We show that, in the evolution of a random graph with $n$ vertices under the $d$-process with $d$ fixed, with high probability, for each $j \in \{0,1,\dots,d-2\}$, the minimum degree jumps from $j$ to $j+1$ when the number of steps left is on the order of $\ln (n)^{d-j-1}$. This answers a question of Ruciński and Wormald. More specifically, we show that, when the last vertex of degree $j$ disappears, the number of steps left divided by $\ln (n)^{d-j-1}$ converges in distribution to the exponential random variable of mean $\frac{j!}{2(d-1)!}$; furthermore, these $d-1$ distributions are independent.
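The $d$-process itself is straightforward to simulate. A minimal sketch (illustrative only, with an inefficient scan over eligible pairs at each step):

```python
import random

def d_process(n, d, seed=0):
    """Simulate the d-process: starting from the empty graph on n vertices,
    repeatedly add an edge chosen uniformly at random among all pairs whose
    endpoints both have degree at most d-1 and are not yet adjacent."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    while True:
        eligible = [(u, v) for u in range(n) for v in range(u + 1, n)
                    if len(adj[u]) < d and len(adj[v]) < d and v not in adj[u]]
        if not eligible:
            return adj
        u, v = rng.choice(eligible)
        adj[u].add(v)
        adj[v].add(u)
```

At termination every degree is at most $d$, and any two vertices of degree below $d$ must already be adjacent, since otherwise they would form an eligible pair.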
It is known that without synchronization via a global clock one cannot obtain common knowledge by communication. Moreover, it is folklore that without communicating higher-level information one cannot obtain arbitrary higher-order shared knowledge. Here, we make this result precise in the setting of gossip where agents make one-to-one telephone calls to share secrets: we prove that “everyone knows that everyone knows that everyone knows all secrets” is unsatisfiable in a logic of knowledge for gossiping. We also prove that, given n agents, $2n-3$ calls are optimal to reach “someone knows that everyone knows all secrets” and that $n - 2 + \binom{n}{2}$ calls are optimal to reach “everyone knows that everyone knows all secrets.”
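For context, the classical first-order gossip result states that $2n-4$ calls suffice (for $n \ge 4$) for every agent to know all secrets; the bounds above concern higher-order knowledge, which this sketch does not model. A minimal simulation of the classical schedule:

```python
def make_call(knowledge, a, b):
    """One telephone call: agents a and b exchange everything they know."""
    merged = knowledge[a] | knowledge[b]
    knowledge[a] = knowledge[b] = merged

def classic_protocol(n):
    """Classical gossip schedule of 2n-4 calls (n >= 4) after which every
    agent knows every secret (first-order knowledge only)."""
    assert n >= 4
    calls = [(i, 0) for i in range(4, n)]          # outsiders phone agent 0
    calls += [(0, 1), (2, 3), (0, 2), (1, 3)]      # four-agent core
    calls += [(i, 0) for i in range(4, n)]         # outsiders phone back
    knowledge = {i: {i} for i in range(n)}
    for a, b in calls:
        make_call(knowledge, a, b)
    return calls, knowledge
```

After the core calls, agents 0 through 3 each know all secrets; the final round of callbacks then informs the remaining agents.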
The European public sector has for a long time tried to change its activities and its relation to the public through the production and provision of data and data-based technologies. Recent debates raised attention to data uses, through which societal value may be realized. However, often absent from these discussions is a conceptual and methodological debate on how to grasp and study such uses. This collection proposes a turn toward data practices—intended here as the analysis of data uses and policies, as they are articulated, understood, or turned into situated activities by different actors in specific contexts, involving organizational rules, socioeconomic factors, discourses, and artifacts. Through a mix of conceptual and methodological studies, the contributions explore how data-driven innovation within public institutions is understood, imagined, planned for, conducted, or assessed. The situations examined in this special issue show, for instance, that data initiatives carried out by different actors lack institutional rules to align data use to the actual needs of citizens; that data scientists are important moral actors whose ethical reasoning should be fostered; and that the materiality of data practices, such as databases, enables and constrains opportunities for public engagement. Collectively, the contributions offer new insights into what constitutes “data-driven innovation practices,” how different practices are assembled, and what their different political, moral, economic, and organizational implications are. The contributions focus on three particular topics of concern: the making of ethical and normative values in practice; organizational collaborations with and around data; and methodological innovations of studying data practices.
This paper addresses the problem of controlling multiple unmanned aerial vehicles (UAVs) cooperating in a formation to carry out a complex task such as surface inspection. We first use the virtual leader-follower model to determine the topology and trajectory of the formation. A double-loop control system combining backstepping and sliding mode control techniques is then designed for the UAVs to track the trajectory. A radial basis function neural network capable of estimating external disturbances is developed to enhance the robustness of the controller. The stability of the controller is proven by using the Lyapunov theorem. A number of comparisons and software-in-the-loop tests have been conducted to evaluate the performance of the proposed controller. The results show that our controller not only outperforms other state-of-the-art controllers but is also sufficient for complex tasks of UAVs such as collecting surface data for inspection. The source code of our controller can be found at https://github.com/duynamrcv/rbf_bsmc.
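As an illustrative aside on the disturbance estimator, a Gaussian RBF network approximates an unknown smooth function from samples. The sketch below uses an offline least-squares fit on a hypothetical disturbance profile; the paper instead adapts the weights online inside the control loop:

```python
import numpy as np

def rbf_features(x, centers, width):
    """Gaussian radial basis features phi_j(x) = exp(-((x - c_j)/width)^2)."""
    return np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)

# Hypothetical disturbance profile to be approximated (not from the paper).
def disturbance(x):
    return 0.5 * np.sin(2 * x) + 0.1 * x

centers = np.linspace(-3, 3, 15)      # RBF centers spread over the domain
width = 0.6
x_train = np.linspace(-3, 3, 200)
Phi = rbf_features(x_train, centers, width)
# Least-squares output weights; the paper adapts these online instead.
w, *_ = np.linalg.lstsq(Phi, disturbance(x_train), rcond=None)

x_test = np.linspace(-2.5, 2.5, 50)
err = np.max(np.abs(rbf_features(x_test, centers, width) @ w - disturbance(x_test)))
```

Because Gaussian RBF networks are universal approximators, a modest number of well-placed centers suffices for a smooth disturbance, which is what makes them attractive inside a robust controller.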
Cyclic proof systems permit derivations that are finite graphs in contrast to conventional derivation trees. The soundness of such proofs is ensured by imposing a soundness condition on derivations. The most common such condition is the global trace condition (GTC), a condition on the infinite paths through the derivation graph. To give a uniform treatment of such cyclic proof systems, Brotherston proposed an abstract notion of trace. We extend Brotherston’s approach into a category theoretical rendition of cyclic derivations, advancing the framework in two ways: first, we introduce activation algebras which allow for a more natural formalisation of trace conditions in extant cyclic proof systems. Second, accounting for the composition of trace information allows us to derive novel results about cyclic proofs, such as introducing a Ramsey-style trace condition. Furthermore, we connect our notion of trace to automata theory and prove that verifying the GTC for abstract cyclic proofs with certain trace conditions is PSPACE-complete.
Innovation, typically spurred by reusing, recombining and synthesizing existing concepts, is expected to result in an exponential growth of the concept space over time. However, our statistical analysis of TechNet, which is a comprehensive technology semantic network encompassing over 4 million concepts derived from patent texts, reveals a linear rather than exponential expansion of the overall technological concept space. Moreover, there is a notable decline in the originality of newly created concepts. These trends can be attributed to the constraints of human cognitive abilities to innovate beyond an ever-growing space of prior art, among other factors. Integrating creative artificial intelligence into the innovation process holds the potential to overcome these limitations and alter the observed trends in the future.
Visual simultaneous localisation and mapping (vSLAM) has shown considerable promise in positioning and navigating across a variety of indoor and outdoor settings, significantly enhancing the mobility of robots employed in industrial and everyday services. Nonetheless, the prevalent reliance of vSLAM technology on the assumption of static environments has led to suboptimal performance in practical implementations, particularly in unstructured and dynamically noisy environments such as substations. Despite advancements in mitigating the influence of dynamic objects through the integration of geometric and semantic information, existing approaches have struggled to strike an equilibrium between performance and real-time responsiveness. This study introduces a lightweight, multi-modal semantic framework predicated on vSLAM, designed to enable intelligent robots to adeptly navigate the dynamic environments characteristic of substations. The framework notably enhances vSLAM performance by mitigating the impact of dynamic objects through a synergistic combination of object detection and instance segmentation techniques. Initially, an enhanced lightweight instance segmentation network is deployed to ensure both the real-time responsiveness and accuracy of the algorithm. Subsequently, the algorithm’s performance is further refined by amalgamating the outcomes of detection and segmentation processes. With a commitment to maximising performance, the framework also ensures the algorithm’s real-time capability. Assessments conducted on public datasets and through empirical experiments have demonstrated that the proposed method markedly improves both the accuracy and real-time performance of vSLAM in dynamic environments.
The U.S. federal government annually awards billions of dollars as contracts to procure different products and services from external businesses. Although the federal government’s immense purchasing power provides a unique opportunity to invest in the nation’s women-owned businesses (WOBs) and minority-owned businesses (MOBs) and advance the entrepreneurial dreams of many more Americans, gender and racial disparities in federal procurement are pervasive. In this study, we undertake a granular examination of these disparities by analyzing the data on 1,551,610 contracts awarded by 58 different federal government agencies. Specifically, we examine the representation of WOBs and MOBs in contracts with varying levels of STEM intensity and across 19 different contract categories, which capture the wide array of products and services purchased by the federal government. We show that contracts with higher levels of STEM intensity are associated with a lower likelihood of being awarded to WOBs and MOBs. Interestingly, the negative association between a contract’s STEM intensity and its likelihood of being awarded to MOBs is particularly salient for Black- and Hispanic-owned businesses. Among the 19 categories of contracts, Black-owned businesses are more likely to receive contracts that are characterized by lower median pay levels. Collectively, these results provide data-driven evidence demonstrating the need to make a distinction between the different categories of MOBs and to consider the type of products and services being procured when examining racial disparities in federal procurement.
Reconfigurable mechanisms can satisfy the requirements that changing environments, working conditions, and tasks impose on the function and performance of a mechanism, and can be applied to machine tool manufacturing, space detection, etc. Inspired by the single-vertex fivefold origami pattern, a new reconfigurable parallel mechanism is proposed in this paper, which has special singular positions and stable motion because it replicates the stabilizing kinematic properties of origami. By analyzing the topological change of the folding process of the pattern and treating it as a reconfigurable joint, a new reconfigurable parallel mechanism with 3, 4, 5, or 6 degrees of freedom is obtained. Then, the kinematics solution, workspace, and singularity of the mechanism are calculated. The results indicate that the singular configuration of the origami-derived reconfigurable parallel mechanism is mainly located in a special plane, and the scope of the workspace remains large after the configuration change. The mechanism has the potential to adapt to multiple tasks and working conditions through conversion among different configurations by folding reconfigurable joints on the branch chain.
We focus on exponential semi-Markov decision processes with unbounded transition rates. We first provide several sufficient conditions under which the value iteration procedure converges to the optimal value function and optimal deterministic stationary policies exist. These conditions are also valid for general semi-Markov decision processes possibly with accumulation points. Then, we apply our results to a service rate control problem with impatient customers. The resulting exponential semi-Markov decision process has unbounded transition rates, which makes the well-known uniformization technique inapplicable. We analyze the structure of the optimal policy and the monotonicity of the optimal value function by using the customization technique that was introduced by the author in prior work.
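The value iteration procedure referred to above can be illustrated on a toy discounted MDP; this is not the paper’s semi-Markov model, whose unbounded transition rates are precisely what make such a simple uniformized treatment inapplicable. A minimal numpy sketch:

```python
import numpy as np

def value_iteration(P, r, gamma, tol=1e-10):
    """Generic value iteration for a discounted MDP.
    P[a] is the transition matrix under action a, r[a] the reward vector.
    Returns the optimal value function and a greedy deterministic policy."""
    n_actions, n_states = len(P), len(r[0])
    V = np.zeros(n_states)
    while True:
        Q = np.array([r[a] + gamma * P[a] @ V for a in range(n_actions)])
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=0)
        V = V_new

# Toy two-state, two-action example (illustrative, not the paper's model).
P = [np.array([[0.9, 0.1], [0.2, 0.8]]),   # action 0
     np.array([[0.5, 0.5], [0.4, 0.6]])]   # action 1
r = [np.array([1.0, 0.0]), np.array([0.8, 0.5])]
V, policy = value_iteration(P, r, gamma=0.95)
```

The fixed point satisfies the Bellman optimality equation, and the greedy policy extracted from it is deterministic and stationary, mirroring the structure of the optimal policies whose existence the paper establishes under weaker conditions.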
The X-shaped tip of prestressed centrifugal concrete piles is nowadays manufactured semi-automatically, combining a manual worker with an automatic welding robot. To make this welding process fully automatic, a welding seam tracking algorithm is considered. Many types of sensors can be used to detect the welding seam, such as vision sensors, laser vision sensors, arc sensors, or touch sensors, each with its own advantages and disadvantages. In this paper, an algorithm for welding seam tracking using a laser distance sensor is proposed. First, the fundamental mathematics of the algorithm is presented. Next, the positioning table system that supports the procedure is designed and manufactured. The object of this research is the fillet joint, owing to the characteristics of the X-shaped tip of the concrete piles. This paper proposes a new method to determine the welding trajectory of the tip using a laser distance sensor. Experimental results are then obtained to verify the proposed idea. Finally, an improvement of the algorithm is considered to increase its accuracy.
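The paper’s exact seam-localization mathematics is not reproduced here. A generic sketch for a fillet joint, under the assumption that each plate surface appears as a straight segment in the scanned distance profile, is to fit a line to each segment and intersect them; the data below is synthetic:

```python
import numpy as np

def seam_point(x, z, split):
    """Locate a fillet-joint seam from a laser distance profile (x, z):
    fit one line to each plate surface and intersect the two lines.
    `split` is the sample index separating the two surfaces."""
    m1, b1 = np.polyfit(x[:split], z[:split], 1)
    m2, b2 = np.polyfit(x[split:], z[split:], 1)
    xs = (b2 - b1) / (m1 - m2)        # intersection of the two fitted lines
    return xs, m1 * xs + b1

# Synthetic V-shaped profile with the corner (seam) at x = 5 (hypothetical).
x = np.linspace(0, 10, 101)
z = np.where(x < 5, 10 - x, x)        # left plate slope -1, right plate slope +1
xs, zs = seam_point(x, z, split=50)
```

Fitting over many samples makes the estimated corner far more robust to sensor noise than reading a single minimum of the profile.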
We present algorithms and a C code to reveal quantum contextuality and evaluate the contextuality degree (a way to quantify contextuality) for a variety of point-line geometries located in binary symplectic polar spaces of small rank. With this code we were not only able to recover, in a more efficient way, all the results of a recent paper by de Boutray et al. [(2022), Journal of Physics A: Mathematical and Theoretical, 55, 475301], but also arrived at a number of new noteworthy results. The paper first describes the algorithms and the C code. Then it illustrates its power on a number of subspaces of symplectic polar spaces whose rank ranges from 2 to 7. The most interesting new results include: (i) non-contextuality of configurations whose contexts are subspaces of dimension 2 and higher, (ii) non-existence of negative subspaces of dimension 3 and higher, (iii) considerably improved bounds for the contextuality degree of both elliptic and hyperbolic quadrics for rank 4, as well as for a particular subgeometry of the three-qubit space whose contexts are the lines of this space, (iv) proof of the non-contextuality of perpsets and, last but not least, (v) the contextual nature of a distinguished subgeometry of a multi-qubit doily, called a two-spread, and computation of its contextuality degree. Finally, in the three-qubit polar space we correct and improve the contextuality degree of the full configuration and also describe finite geometric configurations formed by unsatisfiable/invalid constraints for both types of quadrics as well as for the geometry whose contexts are all 315 lines of the space.
Although reasoning about equations over strings has been extensively studied for several decades, little research has been done on equational reasoning for general clauses over strings. This paper introduces a new superposition calculus for strings and presents an equational theorem-proving framework for clauses over strings. It provides a saturation procedure for clauses over strings and shows that the proposed superposition calculus with contraction rules is refutationally complete. In particular, this paper presents a new decision procedure for solving word problems over strings and provides a new method for solving unification problems over strings w.r.t. a set of conditional equations R over strings, provided R can be finitely saturated under the proposed inference system with contraction rules.
Let $T=(V,E)$ be a tree in which each edge is assigned a cost; let $\mathcal{P}$ be a set of source–sink pairs of vertices in V in which each source–sink pair produces a profit. Given a lower bound K for the profit, the K-prize-collecting multicut problem in trees with submodular penalties is to determine a partial multicut $M\subseteq E$ such that the total profit of the disconnected pairs after removing M from T is at least K, and the total cost of edges in M plus the penalty of the set of still-connected pairs is minimized, where the penalty is determined by a nondecreasing submodular function. Based on the primal-dual scheme, we present a combinatorial polynomial-time algorithm that carefully increases the penalty. In the theoretical analysis, we prove that the approximation factor of the proposed algorithm is $(\frac{8}{3}+\frac{4}{3}\kappa+\varepsilon)$, where $\kappa$ is the total curvature of the submodular function and $\varepsilon$ is any fixed positive number. Experiments reveal that the objective value of the solutions generated by the proposed algorithm is within 130% of the optimal value in most cases.