When computing the probability of a query from a probabilistic answer set program, some parts of the program may not influence the probability of the query, but they do affect the size of the grounding. Identifying and removing them is crucial to speed up the computation. Algorithms for SLG resolution offer the possibility of returning the residual program, which can be used for computing answer sets of normal programs that do not have a total well-founded model. The residual program does not contain the parts of the program that do not influence the probability. In this paper, we propose to exploit the residual program for performing inference. Empirical results on graph datasets show that the approach leads to significantly faster inference. The paper has been accepted at the ICLP2024 conference and is under consideration in Theory and Practice of Logic Programming (TPLP).
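To see what the residual captures (a toy sketch, not the paper's algorithm): the well-founded model of a ground normal program can be computed by an alternating fixpoint of the Gelfond-Lifschitz operator, and the atoms it leaves undefined are exactly the part a residual program must keep. The rule encoding and function names below are illustrative assumptions.

```python
def least_model(pos_rules):
    # least Herbrand model of a definite (negation-free) program
    model, changed = set(), True
    while changed:
        changed = False
        for head, body in pos_rules:
            if body <= model and head not in model:
                model.add(head)
                changed = True
    return model

def gamma(rules, interp):
    # Gelfond-Lifschitz operator: reduct w.r.t. interp, then least model
    reduct = [(h, set(pos)) for h, pos, neg in rules if not set(neg) & interp]
    return least_model(reduct)

def well_founded(rules):
    # alternating fixpoint: true atoms = least fixpoint of gamma∘gamma,
    # possibly-true atoms = gamma(true); the gap is the undefined part
    true = set()
    while (nxt := gamma(rules, gamma(rules, true))) != true:
        true = nxt
    return true, gamma(rules, true) - true
```

For the program `a :- not b. b :- not a. c. d :- c.` (encoded as `[('a', [], ['b']), ('b', [], ['a']), ('c', [], []), ('d', ['c'], [])]`), the true atoms are `{c, d}` and the undefined residual part is `{a, b}`, which is where answer set computation still has work to do.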
The highly digitalised nature of contemporary society has made digital literacy important for newly arrived migrants. However, for teachers, the use of information and communication technologies can be challenging. The aim of the present study is to gain a deeper understanding of how teachers perceive digital resources as useful for teaching migrants language and subject skills. The research question is: in what way do teachers in the language introduction programme for newly arrived migrants in Sweden articulate the use of digital resources in relation to language teaching and to subject teaching? This qualitative study is based on observations of 28 lessons in different subjects in the language introduction programme, as well as interviews with the observed teachers. In analysing the material, we first used the TPACK in situ model (Pareto & Willermark, 2019) to organise the data on the use of digital resources, and thereafter discourse theory (Howarth, 2005) to analyse the data. The results show that the teachers limited their students’ use of digital resources during the lessons, which is apparent in two discourses: distrust and dichotomy. In the discourse of distrust, digital technology is seen as an obstacle to teaching, while the discourse of dichotomy concerns the opposition between the digital and the physical. Moreover, articulations were often expressed in terms of identity; the teachers talked about themselves in relation to digital resources, rather than about how they use digital resources in their teaching.
Variable sharing is a fundamental property in the static analysis of logic programs, since it is instrumental for ensuring correctness and increasing precision while inferring many useful program properties, including modes, determinacy, non-failure, and cost. This has motivated significant work on developing abstract domains to improve the precision and performance of sharing analyses. Much of this work has centered on the family of set-sharing domains, because of the high precision they offer. However, this comes at a price: their scalability to a wide set of realistic programs remains challenging, which hinders their wider adoption. In this work, rather than defining new sharing abstract domains, we focus instead on developing techniques that can be incorporated into analyzers to address aspects known to affect the efficiency of these domains, such as the number of variables, without affecting precision. These techniques are inspired by others used in the context of compiler optimizations, such as expression reassociation and variable trimming. We present several such techniques and provide an extensive experimental evaluation on over 1100 program modules taken from both production code and classical benchmarks, including the Spectector cache analyzer, the s(CASP) system, the libraries of the Ciao system, the LPdoc documenter, and the PLAI analyzer itself. The experimental results are quite encouraging: we obtained significant speedups and, more importantly, the number of modules requiring a timeout was cut in half. As a result, many more programs can be analyzed precisely in reasonable times.
Minimal models of a Boolean formula play a pivotal role in various reasoning tasks. While previous research has primarily focused on qualitative analysis of minimal models, our study concentrates on the quantitative aspect, specifically the counting of minimal models. Exact counting of minimal models is strictly harder than $\#\mathsf{P}$, prompting our investigation into establishing a lower bound for their quantity, which is often useful in related applications. In this paper, we introduce two novel techniques for counting minimal models, leveraging the expressive power of answer set programming: the first technique employs methods from knowledge compilation, while the second draws on recent advancements in hashing-based approximate model counting. Through empirical evaluations, we demonstrate that our methods significantly improve the lower bound estimates of the number of minimal models, surpassing the performance of existing minimal model reasoning systems in terms of runtime.
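For intuition about the quantity involved: a model of a CNF formula can be identified with the set of variables it makes true, and it is minimal when no other model is a strict subset of it. The brute-force sketch below (illustrative only, and exponential, unlike the paper's knowledge-compilation and hashing-based techniques) makes this definition concrete:

```python
from itertools import product

def models(clauses, n):
    # satisfying assignments of a CNF over variables 1..n, each returned
    # as the set of variables assigned true; a literal is a positive or
    # negative integer, as in the DIMACS convention
    found = []
    for bits in product([False, True], repeat=n):
        assign = {v + 1: bits[v] for v in range(n)}
        if all(any(assign[abs(l)] == (l > 0) for l in c) for c in clauses):
            found.append(frozenset(v for v, b in assign.items() if b))
    return found

def count_minimal(clauses, n):
    # a model is minimal iff no other model is a strict subset of it
    ms = models(clauses, n)
    return sum(1 for m in ms if not any(o < m for o in ms))
```

For example, `count_minimal([[1, 2]], 2)` is 2: the models are {1}, {2}, and {1, 2}, of which only the first two are minimal.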
We are interested in automating reasoning with and about study regulations, catering to various stakeholders, ranging from administrators and faculty to students at different stages. Our work builds on an extensive analysis of various study programs at the University of Potsdam. The conceptualization of the underlying principles provides us with a formal account of study regulations. In particular, the formalization reveals the properties of admissible study plans. With these at hand, we propose an encoding of study regulations in Answer Set Programming that produces corresponding study plans. Finally, we show how this approach can be extended to a generic user interface for exploring study plans.
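As a rough illustration of what producing admissible study plans means computationally (the paper uses an ASP encoding; the module names and the single credit-sum constraint below are invented for the example):

```python
from itertools import combinations

def admissible_plans(modules, required_credits):
    # modules: name -> credit value; a plan is admissible here iff its
    # credits sum exactly to the required total (a stand-in for the
    # richer constraints found in real study regulations)
    plans = []
    for size in range(1, len(modules) + 1):
        for combo in combinations(sorted(modules), size):
            if sum(modules[m] for m in combo) == required_credits:
                plans.append(set(combo))
    return plans
```

With modules {'logic': 6, 'db': 6, 'ai': 9, 'seminar': 3} and a 15-credit requirement, the admissible plans are {logic, ai}, {db, ai}, and {logic, db, seminar}.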
We propose a stable model semantics for higher-order logic programs. Our semantics is developed using Approximation Fixpoint Theory (AFT), a powerful formalism that has successfully been used to give meaning to diverse non-monotonic formalisms. The proposed semantics generalizes the classical two-valued stable model semantics of Gelfond and Lifschitz as well as the three-valued one of Przymusinski, retaining their desirable properties. Due to the use of AFT, we also get for free alternative semantics for higher-order logic programs, namely supported model, Kripke-Kleene, and well-founded. Additionally, we define a broad class of stratified higher-order logic programs and demonstrate that they have a unique two-valued higher-order stable model which coincides with the well-founded semantics of such programs. We provide a number of examples in different application domains, which demonstrate that higher-order logic programming under the stable model semantics is a powerful and versatile formalism, which can potentially form the basis of novel ASP systems.
Answer Set Programming with Quantifiers (ASP(Q)) has been introduced to provide a natural extension of ASP modeling to problems in the polynomial hierarchy (PH). However, ASP(Q) lacks a method for elegantly and compactly encoding problems that require a polynomial number of calls to an oracle in $\Sigma_n^p$ (that is, problems in $\Delta_{n+1}^p$). Such problems include, in particular, optimization problems. In this paper, we propose an extension of ASP(Q) in which component programs may contain weak constraints. Weak constraints can be used both for expressing local optimization within quantified component programs and for modeling global optimization criteria. We showcase the modeling capabilities of the new formalism through various application scenarios. Further, we study its computational properties, obtaining complexity results and unveiling non-obvious characteristics of ASP(Q) programs with weak constraints.
Environmental data science for spatial extremes has traditionally relied heavily on max-stable processes. Even though the popularity of these models has perhaps peaked with statisticians, they are still perceived and considered as the “state of the art” in many applied fields. However, while the asymptotic theory supporting the use of max-stable processes is mathematically rigorous and comprehensive, we think that it has also been overused, if not misused, in environmental applications, to the detriment of more purposeful and meticulously validated models. In this article, we review the main limitations of max-stable process models, and strongly argue against their systematic use in environmental studies. Alternative solutions based on more flexible frameworks using the exceedances of variables above appropriately chosen high thresholds are discussed, and an outlook on future research is given. We consider the opportunities offered by hybridizing machine learning with extreme-value statistics, highlighting seven key recommendations moving forward.
Many inductive logic programming (ILP) methods are incapable of learning programs from probabilistic background knowledge, for example, coming from sensory data or neural networks with probabilities. We propose Propper, which handles flawed and probabilistic background knowledge by extending ILP with a combination of neurosymbolic inference, a continuous criterion for hypothesis selection (binary cross-entropy) and a relaxation of the hypothesis constrainer (NoisyCombo). For relational patterns in noisy images, Propper can learn programs from as few as 8 examples. It outperforms binary ILP and statistical models such as a graph neural network.
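The continuous criterion can be made concrete: binary cross-entropy scores a hypothesis by comparing the probabilities it assigns to examples against their 0/1 labels, so confidently correct predictions yield a lower (better) score, giving the learner a graded measure of hypothesis quality rather than a pass/fail test. A minimal sketch (plain Python, not Propper's code):

```python
import math

def binary_cross_entropy(probs, labels):
    # mean BCE of predicted probabilities against 0/1 labels;
    # eps guards against log(0) for fully confident predictions
    eps = 1e-12
    total = 0.0
    for p, y in zip(probs, labels):
        total -= y * math.log(max(p, eps)) + (1 - y) * math.log(max(1 - p, eps))
    return total / len(labels)
```

A hypothesis predicting [0.9, 0.1] for labels [1, 0] scores about 0.105, beating one predicting [0.6, 0.4] (about 0.511), even though both classify the examples correctly under a 0.5 threshold.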
Answer set programming is a well-understood and established problem-solving and knowledge representation paradigm. It has become more prominent amongst a wider audience due to its multiple applications in science and industry. The constant development of advanced programming and modeling techniques regularly extends the toolset for developers and users. This paper compiles and demonstrates different techniques for reusing logic program parts (multi-shot solving), using the arcade game Snake as a case study. This game is particularly interesting because a victory can be assured by solving the NP-hard problem of finding a Hamiltonian cycle. We demonstrate five hands-on implementations in clingo and compare their performance in an empirical evaluation. In addition, our implementation utilizes clingraph to generate a simple yet informative image representation of the game’s progress.
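The connection to Hamiltonian cycles is easy to make concrete: on a grid with an even number of rows, a boustrophedon tour visits every cell exactly once and returns to its start, and a snake that keeps following such a cycle can never collide with itself. A minimal sketch of one such cycle (illustrative only, not one of the five clingo encodings):

```python
def boustrophedon_cycle(width, height):
    # Hamiltonian cycle on a width x height grid (height must be even):
    # sweep columns 1..width-1 row by row, then return up column 0
    assert width >= 2 and height >= 2 and height % 2 == 0
    path = [(0, 0)]
    for y in range(height):
        xs = range(1, width) if y % 2 == 0 else range(width - 1, 0, -1)
        path.extend((x, y) for x in xs)
    path.extend((0, y) for y in range(height - 1, 0, -1))
    return path
```

Each cell appears exactly once and consecutive cells (including the last back to the first) are grid neighbours, so a snake following the cycle only ever moves into empty or just-vacated cells.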
Dung’s abstract Argumentation Framework (AF) has emerged as a key formalism for argumentation in artificial intelligence. It has been extended in several directions, including the possibility to express supports, leading to the development of the Bipolar Argumentation Framework (BAF), and recursive attacks and supports, resulting in the Recursive BAF (Rec-BAF). Different interpretations of supports have been proposed, whereas for Rec-BAF (where the target of attacks and supports may also be attacks and supports) even different semantics for attacks have been defined. However, the semantics of these frameworks have either not been defined in the presence of support cycles or are often quite intricate in terms of the involved definitions. We overcome this limitation and present classical semantics for general BAF and Rec-BAF, and show that the semantics for specific BAF and Rec-BAF frameworks can be defined by very simple and intuitive modifications of those defined for the case of AF. This is achieved by providing a modular definition of the sets of defeated and acceptable elements for each AF-based framework. We also characterize, in an elegant and uniform way, the semantics of general BAF and Rec-BAF in terms of logic programming and partial stable model semantics.
Assurance cases offer a structured way to present arguments and evidence for certification of systems where safety and security are critical. However, creating and evaluating these assurance cases can be complex and challenging, even for systems of moderate complexity. Therefore, there is a growing need to develop new automation methods for these tasks. While most existing assurance case tools focus on automating structural aspects, they lack the ability to fully assess the semantic coherence and correctness of the assurance arguments.
In prior work, we introduced the Assurance 2.0 framework that prioritizes the reasoning process, evidence utilization, and explicit delineation of counter-claims (defeaters) and counter-evidence. In this paper, we present our approach to enhancing Assurance 2.0 with semantic rule-based analysis capabilities using common-sense reasoning and answer set programming solvers, specifically s(CASP). By employing these analysis techniques, we examine the unique semantic aspects of assurance cases, such as logical consistency, adequacy, indefeasibility, etc. The application of these analyses provides both system developers and evaluators with increased confidence about the assurance case.
Text-to-image models are enabling efficient design space exploration, rapidly generating images from text prompts. However, many generative AI tools are imperfect for product design applications as they are not built for the goals and requirements of product design. The unclear link between text input and image output further complicates their application. This work empirically investigates design space exploration strategies that can successfully yield product images that are feasible, novel and aesthetic – three common goals in product design. Specifically, users’ actions within the global and local editing modes, including their time spent, prompt length, mono versus multi-criteria prompts, and goal orientation of prompts, are analyzed. Key findings reveal the pivotal role of mono versus multi-criteria and goal orientation of prompts in achieving specific design goals over time and prompt length. The study recommends prioritizing the use of multi-criteria prompts for feasibility and novelty during global editing while favoring mono-criteria prompts for aesthetics during local editing. Overall, this article underscores the nuanced relationship between the AI-driven text-to-image models and their effectiveness in product design, urging designers to carefully structure prompts during different editing modes to better meet the unique demands of product design.
This article considers modular composition as an approach to engendering structural plasticity in musical works. Structural plasticity, in this case, is defined as the ability for the components of a musical work (e.g., events, ideas, sequences, textures, timbres) to vary in how and when they are presented. In this research, modular composition is the process for creating a collection of individual musical ideas (e.g., sequences, patterns, phrases) termed ‘modules’, and designing a dynamic system for their assembly into cohesive structures. This approach results in musical works that exist in a state of constant structural flux, allowing for real-time alteration while progressing beyond similar existing approaches observed in video game music and interactive music apps, from which this research takes inspiration. Approaches involving compositionally focused intelligent music systems are also observed, highlighting how modular composition bridges traditional compositional practices and the design of interactive music systems. Two of the authors’ own works are discussed with regard to how modular composition can be implemented in varying creative ways. The outcome of this work illuminates the creative possibilities of integrating traditional compositional practices with new digital approaches to arrive at a more structurally plastic and alterable form of music.
The logico-algebraic study of Lewis’s hierarchy of variably strict conditional logics has been essentially unexplored, hindering our understanding of their mathematical foundations and of their connections with other logical systems. This work starts filling this gap by providing a logico-algebraic analysis of Lewis’s logics. We begin by introducing novel finite axiomatizations for Lewis’s logics on the syntactic side, distinguishing between global and local consequence relations on Lewisian sphere models on the semantical side, in parallel to the case of modal logic. As our first main results, we prove the strong completeness of the calculi with respect to the corresponding semantical consequence on spheres, and a deduction theorem. We then demonstrate that the global calculi are strongly algebraizable in terms of a variety of Boolean algebras with a binary operator representing the counterfactual implication; in contrast, we show that the local ones are generally not algebraizable, although they can be characterized as the degree-preserving logics over the same algebraic models. This yields the strong completeness of all the logics with respect to the algebraic models.
Solving a decision theory problem usually involves finding the actions, among a set of possible ones, that optimize the expected reward, while possibly accounting for the uncertainty of the environment. In this paper, we introduce the possibility of encoding decision theory problems with Probabilistic Answer Set Programming under the credal semantics via decision atoms and utility attributes. To solve the task, we propose an algorithm based on three layers of Algebraic Model Counting, which we test on several synthetic datasets against an algorithm that adopts answer set enumeration. Empirical results show that our algorithm can manage non-trivial instances of programs in a reasonable amount of time.
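In its simplest form, the task is: pick truth values for the decision atoms that maximize the utility expected over a distribution of worlds. A brute-force sketch of that objective (illustrative only; the decision and world names are invented, and the paper's algorithm uses layered Algebraic Model Counting rather than explicit enumeration):

```python
from itertools import product

def best_decision(decision_atoms, worlds, utility):
    # worlds: list of (probability, world) pairs; utility(choice, world) -> reward
    # enumerate all 2^n truth assignments to the decision atoms and
    # keep the one with the highest expected utility
    best_choice, best_eu = None, float('-inf')
    for values in product([False, True], repeat=len(decision_atoms)):
        choice = dict(zip(decision_atoms, values))
        eu = sum(p * utility(choice, w) for p, w in worlds)
        if eu > best_eu:
            best_choice, best_eu = choice, eu
    return best_choice, best_eu
```

With one decision atom `umbrella`, worlds rain (0.3) and sun (0.7), and utilities 10/-20 for rain with/without an umbrella and -2/5 for sun, taking the umbrella wins with expected utility 1.6.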
Forecasting international migration is a challenge that, despite its political and policy salience, has seen limited success so far. In this proof-of-concept paper, we employ a range of macroeconomic data to represent different drivers of migration. We also take into account the relatively consistent set of migration policies within the European Common Market, with its constituent freedom of movement of labour. Using panel vector autoregressive (VAR) models for mixed-frequency data, we forecast migration in the short- and long-term horizons for 26 of the 32 countries within the Common Market. We demonstrate how the methodology can be used to assess the possible responses of other macroeconomic variables to unforeseen migration events—and vice versa. Our results indicate reasonable in-sample performance of migration forecasts, especially in the short term, although with varying levels of accuracy. They also underline the need for taking country-specific factors into account when constructing forecasting models, with different variables being important across the regions of Europe. For the longer term, the proposed methods, despite high prediction errors, can still be useful as tools for setting coherent migration scenarios and analysing responses to exogenous shocks.
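For readers unfamiliar with the model class: a VAR(1) regresses each variable on the lagged values of all variables, and iterating the fitted recursion gives multi-step forecasts. A minimal numpy sketch (this ignores the mixed-frequency panel structure the paper actually uses; all names are illustrative):

```python
import numpy as np

def fit_var1(Y):
    # Y: (T, k) series; fit y_t = c + A @ y_{t-1} + e_t by least squares
    X = np.hstack([np.ones((len(Y) - 1, 1)), Y[:-1]])  # regressors [1, y_{t-1}]
    B, *_ = np.linalg.lstsq(X, Y[1:], rcond=None)      # shape (k + 1, k)
    return B[0], B[1:].T                               # intercept c, matrix A

def forecast_var1(c, A, y_last, steps):
    # iterate the fitted recursion forward for multi-step forecasts
    out, y = [], y_last
    for _ in range(steps):
        y = c + A @ y
        out.append(y)
    return np.array(out)
```

The same machinery supports the scenario analysis described above: shocking one variable in `y_last` and re-running `forecast_var1` shows how the fitted system propagates the shock to the others.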
Cable-guiding mechanisms (CGMs) and their stiffness characteristics directly influence the dynamic behavior of the cable-driven upper limb rehabilitation robot (PCUR), and thus the PCUR’s performance. This paper introduces a novel CGM design. Given the precision and movement stability requirements of the mechanism, an analytical model is developed. Using this model, we analyze the error of the CGM and derive velocity and acceleration mappings from the moving platform to the cables. The continuity of the cable trajectory and tension is rigorously demonstrated. Subsequently, a mathematical model for PCUR stiffness is formulated. Using MATLAB/Simscape Multibody, simulation models for the CGM and the stiffness characteristics are constructed. The feasibility of the proposed CGM design is validated through simulation and experimentation, while the influence of the stiffness characteristics on PCUR motion stability is comprehensively analyzed.