Comparing logic programming and formal argumentation; the case of ideal and eager semantics

2021
pp. 1-28
Author(s):  
Martin Caminada
Sri Harikrishnan
Samy Sá

The connection between logic programming and formal argumentation has been studied starting from the landmark 1995 paper of Dung. Subsequent work has identified a standard translation from logic programs to (instantiated) argumentation frameworks, under which pairwise correspondences hold between various logic programming semantics and various formal argumentation semantics. This includes the correspondence between 3-valued stable and complete semantics, between well-founded and grounded semantics, and between 2-valued stable (LP) and stable (argumentation) semantics. In the current paper, we show that the existing translation also yields the additional correspondence between ideal semantics for logic programming and ideal semantics for formal argumentation. We also show that no such correspondence holds between eager semantics for logic programming and eager semantics for formal argumentation, at least when translating from logic programming to formal argumentation. Overall, the current work should be seen as completing the analysis of correspondences between mainstream admissibility-based argumentation semantics and their logic programming counterparts.
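As a concrete illustration of the argumentation side of this correspondence, here is a small brute-force sketch (assumptions ours: standard Dung definitions and a made-up three-argument framework) that computes the preferred and ideal extensions. In this toy framework the ideal extension {a} is strictly larger than the grounded extension, which is empty.

```python
from itertools import chain, combinations

# Toy framework: a and b attack each other; c attacks b and also attacks itself.
ARGS = {"a", "b", "c"}
ATTACKS = {("a", "b"), ("b", "a"), ("c", "b"), ("c", "c")}

def conflict_free(S):
    return not any((x, y) in ATTACKS for x in S for y in S)

def defends(S, x):
    # Every attacker of x is counter-attacked by some member of S.
    attackers = {y for (y, z) in ATTACKS if z == x}
    return all(any((s, y) in ATTACKS for s in S) for y in attackers)

def admissible(S):
    return conflict_free(S) and all(defends(S, x) for x in S)

def powerset(items):
    items = list(items)
    return (set(c) for c in chain.from_iterable(
        combinations(items, k) for k in range(len(items) + 1)))

admissible_sets = [S for S in powerset(ARGS) if admissible(S)]
preferred = [S for S in admissible_sets
             if not any(S < T for T in admissible_sets)]      # maximal admissible sets
ideal = max((S for S in admissible_sets
             if all(S <= P for P in preferred)), key=len)     # ideal extension
print("preferred:", preferred, "ideal:", ideal)               # preferred: [{'a'}]  ideal: {'a'}
```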

2017
Vol 17 (02)
pp. e16
Author(s):  
Sergio Alejandro Gómez

We present an approach for performing instance checking in possibilistic description logic programming ontologies by accruing arguments that support the membership of individuals to concepts. Ontologies are interpreted as possibilistic logic programs where accruals of arguments are regarded as vertices in an abstract argumentation framework. A suitable attack relation between accruals is defined. We present a reasoning framework with a case study and a Java-based implementation of the proposed approach that is capable of reasoning under Dung’s grounded semantics.
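Dung's grounded semantics, used as the reasoning backbone here, can be computed as the least fixpoint of the characteristic function. A minimal sketch under the standard definitions (the framework below is a made-up attack chain, not one derived from an ontology):

```python
ARGS = {"a", "b", "c", "d"}
ATTACKS = {("a", "b"), ("b", "c"), ("c", "d")}   # toy attack chain: a -> b -> c -> d

def defended_by(S, x):
    """x is defended by S iff every attacker of x is attacked by some member of S."""
    attackers = {y for (y, z) in ATTACKS if z == x}
    return all(any((s, y) in ATTACKS for s in S) for y in attackers)

# Grounded extension: least fixpoint of F(S) = {x in ARGS | S defends x}.
grounded, previous = set(), None
while grounded != previous:
    previous = grounded
    grounded = {x for x in ARGS if defended_by(grounded, x)}

print(grounded)   # {'a', 'c'}: a is unattacked, and a defends c by attacking b
```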


2019
Vol 19 (5-6)
pp. 671-687
Author(s):  
JORGE FANDINNO

In a recent line of research, two familiar concepts from logic programming semantics (unfounded sets and splitting) were extrapolated to the case of epistemic logic programs. The property of epistemic splitting provides a natural and modular way to understand programs without epistemic cycles but, surprisingly, was only fulfilled by Gelfond’s original semantics (G91), among the many proposals in the literature. On the other hand, G91 may suffer from a kind of self-supported, unfounded derivations when epistemic cycles come into play. Recently, the absence of these derivations was also formalised as a property of epistemic semantics called foundedness. Moreover, a first semantics proved to satisfy foundedness was also proposed, the so-called Founded Autoepistemic Equilibrium Logic (FAEEL). In this paper, we prove that FAEEL also satisfies the epistemic splitting property, something that, together with foundedness, was not fulfilled by any other approach to date. To prove this result, we provide an alternative characterisation of FAEEL as a combination of G91 with a simpler logic we call Founded Epistemic Equilibrium Logic (FEEL), which is roughly an extrapolation of the stable model semantics to the modal logic S5.


1992
Vol 16 (3-4)
pp. 231-262
Author(s):  
Philippe Balbiani

The beauty of modal logics and their interest lie in their ability to represent such different intensional concepts as knowledge, time, obligation, provability in arithmetic, … according to the properties satisfied by the accessibility relations of their Kripke models (transitivity, reflexivity, symmetry, well-foundedness, …). The purpose of this paper is to study the ability of modal logics to represent the concepts of provability and unprovability in logic programming. The use of modal logic to study the semantics of logic programming with negation is defended with the help of a modal completion formula. This formula is a modal translation of Clark’s completion formula. It gives soundness and completeness proofs for the negation as failure rule. It offers a formal characterization of unprovability in logic programs, and it characterizes their stratified semantics as well.
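The paper's modal completion formula is not reproduced here, but the classical construction it translates is easy to recall. For the propositional program $\{\, p \leftarrow q, \mathrm{not}\; r; \;\; q \leftarrow \,\}$ (our own toy example), Clark's completion is
$$ p \leftrightarrow (q \wedge \neg r), \qquad q \leftrightarrow \top, \qquad r \leftrightarrow \bot, $$
so $r$ fails (negation as failure) and $p$ is derivable, both as classical consequences of the completion.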


Author(s):  
Andrew Cropper
Sebastijan Dumančic

A major challenge in inductive logic programming (ILP) is learning large programs. We argue that a key limitation of existing systems is that they use entailment to guide the hypothesis search. This approach is limited because entailment is a binary decision: a hypothesis either entails an example or it does not, and there is no intermediate position. To address this limitation, we go beyond entailment and use 'example-dependent' loss functions to guide the search, where a hypothesis can partially cover an example. We implement our idea in Brute, a new ILP system that uses best-first search, guided by an example-dependent loss function, to incrementally build programs. Our experiments on three diverse program synthesis domains (robot planning, string transformations, and ASCII art) show that Brute can substantially outperform existing ILP systems, both in terms of predictive accuracy and learning time, and can learn programs 20 times larger than those learned by state-of-the-art systems.
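The search strategy can be sketched generically. The following is a minimal illustration (not Brute itself; the rules, examples, and per-example scores are made up) of best-first search over growing rule sets, ordered by an example-dependent loss that rewards partial coverage:

```python
import heapq
from itertools import count

EXAMPLES = ["e1", "e2", "e3"]
# Per-example "distance to covering" in [0, 1] for each candidate rule (0 = covered).
RULES = {
    "r1": {"e1": 0.0, "e2": 0.6, "e3": 1.0},
    "r2": {"e1": 1.0, "e2": 0.0, "e3": 0.4},
    "r3": {"e1": 1.0, "e2": 1.0, "e3": 0.0},
}

def loss(hypothesis):
    """Example-dependent loss: sum over examples of the best distance achieved so far."""
    return sum(min((RULES[r][e] for r in hypothesis), default=1.0) for e in EXAMPLES)

def best_first(max_rules=3):
    tie = count()                          # tie-breaker so the heap never compares sets
    frontier = [(loss(frozenset()), next(tie), frozenset())]
    seen = {frozenset()}
    while frontier:
        current_loss, _, hypothesis = heapq.heappop(frontier)
        if current_loss == 0.0:            # every example fully covered
            return hypothesis, current_loss
        if len(hypothesis) < max_rules:
            for rule in RULES:             # grow the program one rule at a time
                child = hypothesis | {rule}
                if child not in seen:
                    seen.add(child)
                    heapq.heappush(frontier, (loss(child), next(tie), child))
    return None

print(best_first())   # (frozenset({'r1', 'r2', 'r3'}), 0.0)
```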


2018
Vol 19 (2)
pp. 262-289
Author(s):  
ELIAS MARCOPOULOS
YUANLIN ZHANG

Recent progress in logic programming (e.g. the development of the answer set programming (ASP) paradigm) has made it possible to teach logic programming to general undergraduate and even middle/high school students. Given the limited exposure of these students to computer science, the complexity of downloading, installing, and using tools for writing logic programs could be a major barrier to logic programming reaching a much wider audience. We developed onlineSPARC, an online ASP environment with a self-contained file system and a simple interface. It allows users to type/edit logic programs and perform several tasks over programs, including asking a query to a program, getting the answer sets of a program, and producing a drawing/animation based on the answer sets of a program.
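onlineSPARC performs these tasks in the browser; purely as a point of comparison, getting the answer sets of a small program locally looks like this with the clingo Python API (an assumption on our part: clingo installed via pip, and plain ASP syntax rather than SPARC's sorted syntax):

```python
from clingo import Control

# A two-choice program: exactly one of p, q holds in each answer set.
PROGRAM = """
p :- not q.
q :- not p.
"""

ctl = Control(["0"])                      # "0" asks for all answer sets
ctl.add("base", [], PROGRAM)
ctl.ground([("base", [])])
ctl.solve(on_model=lambda m: print("Answer set:", m))
# Prints the two answer sets, {p} and {q}, in some order.
```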


Author(s):  
Farhad Shakerin
Gopal Gupta

We present a heuristic-based algorithm to induce nonmonotonic logic programs that explain the behavior of XGBoost-trained classifiers. We use a technique based on the LIME approach to locally select the most important features contributing to the classification decision. Then, in order to explain the model’s global behavior, we propose the LIME-FOLD algorithm, a heuristic-based inductive logic programming (ILP) algorithm capable of learning nonmonotonic logic programs, which we apply to a transformed dataset produced by LIME. Our proposed approach is agnostic to the choice of the ILP algorithm. Our experiments with UCI standard benchmarks suggest a significant improvement in terms of classification evaluation metrics. Meanwhile, the number of induced rules decreases dramatically compared to ALEPH, a state-of-the-art ILP system.
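The local feature-selection step can be pictured as follows. This is a minimal LIME-style sketch (our own stand-in model and kernel width; not the LIME library or the LIME-FOLD pipeline): perturb an instance, query the black box, fit a locally weighted linear surrogate, and rank features by coefficient magnitude. The top-ranked features per example would then feed the transformed dataset handed to the rule learner.

```python
import numpy as np

rng = np.random.default_rng(0)

def black_box(X):
    """Stand-in for a trained classifier (e.g. XGBoost): returns P(class = 1)."""
    return 1 / (1 + np.exp(-(2.0 * X[:, 0] - 1.5 * X[:, 2])))

instance = np.array([0.5, -1.0, 1.0, 0.0])

# 1. Perturb the instance and query the model.
perturbed = instance + rng.normal(scale=0.5, size=(500, instance.size))
predictions = black_box(perturbed)

# 2. Weight perturbations by proximity to the instance (Gaussian kernel).
distances = np.linalg.norm(perturbed - instance, axis=1)
weights = np.exp(-(distances ** 2) / 0.5)

# 3. Fit a weighted linear surrogate and rank features by |coefficient|.
design = np.hstack([perturbed, np.ones((len(perturbed), 1))])   # add intercept column
sqrt_w = np.sqrt(weights)[:, None]
coef, *_ = np.linalg.lstsq(design * sqrt_w, predictions * sqrt_w[:, 0], rcond=None)
print("Locally most influential features:", np.argsort(-np.abs(coef[:-1])))
# Features 0 and 2 (the ones the stand-in model actually uses) come out on top.
```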


2011
Vol 11 (2-3)
pp. 263-296
Author(s):  
SHAY B. COHEN
ROBERT J. SIMMONS
NOAH A. SMITH

Weighted logic programming, a generalization of bottom-up logic programming, is a well-suited framework for specifying dynamic programming algorithms. In this setting, proofs correspond to the algorithm's output space, such as a path through a graph or a grammatical derivation, and are given a real-valued score (often interpreted as a probability) that depends on the real weights of the base axioms used in the proof. The desired output is a function over all possible proofs, such as a sum of scores or an optimal score. We describe the product transformation, which can merge two weighted logic programs into a new one. The resulting program optimizes a product of proof scores from the original programs, constituting a scoring function known in machine learning as a “product of experts.” Through the addition of intuitive constraining side conditions, we show that several important dynamic programming algorithms can be derived by applying the product transformation to simpler weighted logic programs. In addition, we show how the computation of Kullback–Leibler divergence, an information-theoretic measure, can be interpreted using the product transformation.
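For readers who have not met the machine-learning term, a product of experts combines several scoring models multiplicatively; in its probabilistic reading (notation ours, not the paper's),
$$ p(x) \;=\; \frac{\prod_{k} p_k(x)}{\sum_{x'} \prod_{k} p_k(x')}, $$
and, analogously, the merged program scores a proof by the product of the scores that the two original programs assign to their corresponding proofs.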


2011
Vol 13 (1)
pp. 107-142
Author(s):  
FREDERICK MAIER

We provide a method of translating theories of Nute's defeasible logic into logic programs, and a corresponding translation in the opposite direction. Under certain natural restrictions, the conclusions of defeasible theories under the ambiguity propagating defeasible logic ADL correspond to those of the well-founded semantics for normal logic programs, and so it turns out that the two formalisms are closely related. Using the same translation of logic programs into defeasible theories, the semantics for the ambiguity blocking defeasible logic NDL can be seen as indirectly providing an ambiguity blocking semantics for logic programs. We also provide antimonotone operators for both ADL and NDL, each based on the Gelfond–Lifschitz (GL) operator for logic programs. For defeasible theories without defeaters or priorities on rules, the operator for ADL corresponds to the GL operator and so can be seen as partially capturing the consequences according to ADL. Similarly, the operator for NDL captures the consequences according to NDL, though in this case no restrictions on theories apply. Both operators can be used to define stable model semantics for defeasible theories.
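The GL operator mentioned here is the reduct-based construction from stable model semantics. A minimal brute-force sketch of that standard construction for a tiny propositional normal program (ours, not the paper's ADL/NDL operators):

```python
from itertools import chain, combinations

# Rules are (head, positive_body, negative_body):  p :- not q.   q :- not p.   r :- p.
PROGRAM = [("p", [], ["q"]), ("q", [], ["p"]), ("r", ["p"], [])]

def atoms(program):
    result = set()
    for head, pos, neg in program:
        result.update([head], pos, neg)
    return result

def reduct(program, candidate):
    """Gelfond-Lifschitz reduct: drop rules blocked by the candidate, strip 'not'."""
    return [(h, pos, []) for h, pos, neg in program if not (set(neg) & candidate)]

def least_model(positive_program):
    """Least model of a negation-free program by fixpoint iteration; the GL operator
    applied to a candidate is the least model of the reduct."""
    model, changed = set(), True
    while changed:
        changed = False
        for head, pos, _ in positive_program:
            if set(pos) <= model and head not in model:
                model.add(head)
                changed = True
    return model

def stable_models(program):
    universe = list(atoms(program))
    candidates = chain.from_iterable(combinations(universe, k)
                                     for k in range(len(universe) + 1))
    return [set(c) for c in candidates
            if least_model(reduct(program, set(c))) == set(c)]

print(stable_models(PROGRAM))   # the two stable models: {'q'} and {'p', 'r'}
```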


2007
Vol 7 (3)
pp. 301-353
Author(s):  
NIKOLAY PELOV
MARC DENECKER
MAURICE BRUYNOOGHE

In this paper, we present a framework for the semantics and the computation of aggregates in the context of logic programming. In our study, an aggregate can be an arbitrary interpreted second-order predicate or function. We define extensions of the Kripke-Kleene, the well-founded and the stable semantics for aggregate programs. The semantics is based on the concept of a three-valued immediate consequence operator of an aggregate program. Such an operator approximates the standard two-valued immediate consequence operator of the program, and induces a unique Kripke-Kleene model, a unique well-founded model and a collection of stable models. We study different ways of defining such operators and thus obtain a framework of semantics, offering different trade-offs between precision and tractability. In particular, we investigate conditions on the operator that guarantee that the computation of the three types of semantics remains on the same level as for logic programs without aggregates. Other results show that, in practice, even efficient three-valued immediate consequence operators which are very low in the precision hierarchy still provide optimal precision.
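To fix intuitions, here is the standard two-valued immediate consequence operator that those three-valued operators approximate, extended with a toy count aggregate (a sketch under our own encoding, not the paper's framework):

```python
# Rules are (head, positive_body, optional_aggregate_test_on_the_interpretation).
RULES = [
    ("a", [], None),                                    # a.
    ("b", ["a"], None),                                 # b :- a.
    ("c", [], lambda I: len({"a", "b"} & I) >= 2),      # c :- #count{a; b} >= 2.
]

def t_p(interpretation):
    """Two-valued immediate consequence operator for RULES."""
    return {head for head, body, agg in RULES
            if set(body) <= interpretation and (agg is None or agg(interpretation))}

# The aggregate used here is monotone, so iterating from the empty set reaches
# the least fixpoint: {} -> {a} -> {a, b} -> {a, b, c}.
I = set()
while t_p(I) != I:
    I = t_p(I)
print(I)   # {'a', 'b', 'c'}
```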


2017
Vol 17 (5-6)
pp. 906-923
Author(s):  
EKATERINA KOMENDANTSKAYA
YUE LI

Logic programming is a Turing-complete language. As a consequence, designing algorithms that decide termination and non-termination of programs or decide inductive/coinductive soundness of formulae is a challenging task. For example, the existing state-of-the-art algorithms can only semi-decide coinductive soundness of queries in logic programming for regular formulae. Another less famous, but equally fundamental and important, undecidable property is productivity. If a derivation is infinite and coinductively sound, we may ask whether the computed answer it determines actually computes an infinite formula. If it does, the infinite computation is productive. This intuition was first expressed under the name of computations at infinity in the 1980s. In the modern days of the Internet and stream processing, its importance lies in its connection to infinite data structure processing. Recently, an algorithm was presented that semi-decides a weaker property: productivity of logic programs. A logic program is productive if it can give rise to productive derivations. In this paper, we strengthen these recent results. We propose a method that semi-decides productivity of individual derivations for regular formulae. Thus, we at last give an algorithmic counterpart to the notion of productivity of derivations in logic programming. This is the first algorithmic solution to the problem since it was raised more than 30 years ago. We also present an implementation of this algorithm.
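In the stream-processing sense referenced above, productivity means that every finite prefix of an infinite output is delivered after finitely many steps. The following is a loose analogy only (not the paper's formalism of infinite derivations), using a generator that emits the numerals 0, s(0), s(s(0)), …:

```python
from itertools import islice

def numerals(term="0"):
    """A productive stream: each next observable element arrives after finite work."""
    while True:
        yield term
        term = f"s({term})"

print(list(islice(numerals(), 4)))   # ['0', 's(0)', 's(s(0))', 's(s(s(0)))']
# A non-productive definition would instead loop forever without ever
# emitting the next element of the stream.
```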

