Archive of events from 2022
An archive of events from the year 2022.
-
Public defence of Giacomo Barlucchi’s Licentiate thesis.
Opponent: Docent Sebastian Enqvist, Stockholms universitet
Examiner: Docent Fredrik Engström, Göteborgs universitet
-
I present a topological epistemic logic, motivated by a famous epistemic puzzle: the Surprise Exam Paradox. It is a fixed-point modal logic, with modalities for knowledge (modelled as the universal modality), knowability of a proposition (represented by the topological interior operator), and (un)knowability of the actual world. The last notion has a non-self-referential reading (modelled by Cantor derivative: the set of limit points of a given set) and a self-referential one (modelled by the so-called perfect core of a given set: its largest subset which is a fixed point of relativized derivative). I completely axiomatize this logic, showing that it is decidable and PSPACE-complete, as well as briefly explain how the same model-theoretic method can be elaborated to prove the completeness and decidability of the full topological mu-calculus. Finally, I apply it to the analysis of the Surprise Exam Paradox and other puzzles.
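As a rough illustration of the semantics just described (a sketch only; the modality symbols and notation below are illustrative and not taken from the paper), on a topological model with domain \(X\) the operators can be read as
\[
x \in \|K\varphi\| \iff \|\varphi\| = X, \qquad
\|\Box\varphi\| = \mathrm{int}\,\|\varphi\|, \qquad
\|\langle d \rangle \varphi\| = d\,\|\varphi\|,
\]
where \(\mathrm{int}\) is topological interior and \(d(S)\) is the set of limit points of \(S\); the perfect core of a set is then, as stated above, its largest subset that is a fixed point of the relativized derivative.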
References:
- A. Baltag, N. Bezhanishvili, D. Fernández-Duque. The Topology of Surprise. Proceedings of the International Conference on Principles of Knowledge Representation and Reasoning. Vol. 19 (1), 33-42, 2022. Available online in ILLC Prepublication (PP) series PP-2022-06.
- A. Baltag, N. Bezhanishvili, D. Fernández-Duque. The Topological Mu-Calculus: Completeness and Decidability. LICS ’21: Proceedings of the 36th Annual ACM/IEEE Symposium on Logic in Computer Science, vol. 89, 1-13, 2021. doi:10.1109/lics52264.2021.9470560. Available online in ILLC Prepublication (PP) series PP-2021-07.
-
-
I will first give a brief background, with some examples, on ‘Carnap’s problem’: to what extent a consequence relation in a formal language fixes the meaning, relative to some given semantics, of the logical constants in that language. I then focus on intuitionistic propositional logic and show that it is ‘Carnap categorical’: the only interpretation of the connectives consistent with the usual intuitionistic consequence relation is the standard one. This holds relative to most well-known semantics with respect to which intuitionistic logic is sound and complete; among them Kripke semantics, Beth semantics, Dragalin semantics, and topological semantics. It also holds for algebraic semantics, although categoricity in that case is different in kind from categoricity relative to possible worlds style semantics. This is joint work with Haotian Tong.
-
In the book Das Kontinuum (1918), Hermann Weyl presents a coherent and sophisticated approach to analysis from a predicativist perspective. In the first chapter of (Weyl 1918), Weyl introduces a predicative concept of set, according to which sets are built ‘from the bottom up’ starting from the natural numbers. Weyl clearly contrasts this predicative concept of set with the concept of arbitrary set, which he finds wanting, especially when working with infinite sets. In the second chapter of Das Kontinuum, he goes on to show that large portions of 19th century analysis can be developed on the basis of his predicative concept of set.
Das Kontinuum inspired fundamental ideas in mathematical logic and beyond, such as the logical analysis of predicativity of the 1950-60’s, Solomon Feferman’s work on predicativity and Errett Bishop’s constructive mathematics. The seeds of Das Kontinuum are already visible in the early (Weyl 1910), where Weyl, among other things, offers a clarification of Zermelo’s axiom schema of Separation.
In this talk, I examine Weyl’s predicative concept of set in (Weyl 1918) and discuss its origins in (Weyl 1910).
Bibliography
- Weyl, H., 1910, Über die Definitionen der mathematischen Grundbegriffe, Mathematisch-naturwissenschaftliche Blätter, 7, pp. 93-95 and pp. 109-113.
- Weyl, H., 1918, Das Kontinuum. Kritische Untersuchungen über die Grundlagen der Analysis, Veit, Leipzig. Translated into English, Dover Books on Mathematics, 2003.
-
Ill-founded and cyclic proof systems have proven to be fruitful alternatives to traditional, finitary proof systems when dealing with modal fixed point logics. The question arises whether ill-founded and cyclic systems can also be designed for intuitionistic versions of these logics. In this talk, I will present ongoing research aimed at developing an ill-founded sequent calculus for intuitionistic linear temporal logic. In particular, I will focus on our attempt to show completeness of such a calculus using backwards proof search. This is joint work with Lukas Zenger, Bahareh Afshari and Graham Leigh.
-
In standard proof theory a proof is a finite tree whose nodes are labelled by sequents and which is generated by a finite set of rules. One such rule is the cut rule, which is a sequent calculus analogue of the well-known modus ponens rule. As the cut rule introduces new formulas into a proof, it makes proofs non-analytic. Therefore, much effort is devoted to showing that the cut rule can be eliminated from a given calculus without changing the set of derivable sequents. Such a cut-elimination theorem is established by showing how instances of the cut rule in a given proof can be pushed upwards until the premises of the instances are axioms. One then shows that in this case the instances of cut can be eliminated from the proof. However, this standard strategy does not work when one considers so-called non-wellfounded proofs, which are proofs in which some branches are allowed to be infinitely long. Clearly, in such a setting it is not enough to simply push instances of cut upwards, as a leaf might never be reached. In this talk I will present ongoing research on cut-elimination for non-wellfounded proofs for a fragment of the modal mu-calculus. I will discuss the basic idea of how to construct a cut-free proof from a given non-wellfounded proof and present some of the current problems with my method.
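For reference, the cut rule in a standard two-sided sequent calculus has the familiar form (a generic formulation, not specific to the calculi of the talk):
\[
\frac{\Gamma \Rightarrow \Delta, A \qquad A, \Sigma \Rightarrow \Pi}{\Gamma, \Sigma \Rightarrow \Delta, \Pi}\;(\mathrm{cut})
\]
The cut formula \(A\) occurs in the premises but not in the conclusion, which is exactly why cut instances can introduce formulas that do not appear in the end sequent.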
-
Modal fixed point logics are extensions of modal logics with operators denoting least, respectively greatest, fixed points of certain positive formulas. Logics of this kind include temporal logics, logics of common knowledge, program logics, and most famously, the modal mu-calculus. We present a survey of deductive systems for the logic of common knowledge as a typical example of a modal fixed point logic. In particular, we present two different Hilbert-style axiomatizations and several infinitary cut-free sequent systems that are based on omega-rules. Further, we discuss the problem of syntactic cut-elimination for common knowledge and its generalization to the modal mu-calculus.
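As a point of orientation (one standard formulation, not necessarily the exact systems of the talk), common knowledge \(C\varphi\) is a greatest fixed point of the "everybody knows" operator \(E\), and the infinitary systems replace the fixed point by its finite approximations via an omega-rule:
\[
C\varphi \leftrightarrow E(\varphi \wedge C\varphi),
\qquad
\frac{\Gamma \Rightarrow \Delta, E^{n}\varphi \quad \text{for all } n \geq 1}{\Gamma \Rightarrow \Delta, C\varphi}\;(\omega),
\]
where \(E^{1}\varphi = E\varphi\) and \(E^{n+1}\varphi = E(E^{n}\varphi)\).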
-
Strict/tolerant logic, ST, has been of considerable interest in the last few years, in part because it forces consideration of what it means for two logics to be different, or the same. And thus, of what it means to be a logic. The basic idea behind ST is that it evaluates the premises and the conclusions of its consequence relation differently, with the premises held to stricter standards while conclusions are treated more tolerantly. More specifically, ST is a three-valued logic with left sides of sequents understood as if in Kleene’s Strong Three Valued Logic, and right sides as if in Priest’s Logic of Paradox. Surprisingly, this hybrid validates the same sequents that classical logic does, though it differs from classical logic at the metaconsequence level. A version of this result has been extended to meta, metameta, etc. consequence levels, creating a very interesting hierarchy of logics. All this is work of others, and I will summarize it.
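Concretely (a standard rendering of the definition just described, with the usual three values \(1, \tfrac{1}{2}, 0\), where \(1\) is the only value designated strictly and \(1, \tfrac{1}{2}\) are designated tolerantly), strict/tolerant validity reads:
\[
\Gamma \vDash_{ST} \Delta \quad\text{iff}\quad \text{for every strong Kleene valuation } v:\ \text{if } v(\gamma) = 1 \text{ for all } \gamma \in \Gamma, \text{ then } v(\delta) \in \{1, \tfrac{1}{2}\} \text{ for some } \delta \in \Delta.
\]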
My contribution to the subject is to show that the original ideas behind ST are, in fact, much more general than they first seemed, and that an infinite family of many-valued logics have Strict/Tolerant counterparts. Besides classical logic, this family includes both Kleene’s and Priest’s logics themselves, as well as first degree entailment. For instance, for both the Kleene and the Priest logic, the corresponding strict/tolerant logic is six-valued, but with differing sets of strictly and tolerantly designated truth values. There is a reverse notion of Tolerant/Strict logics, which exist for the same structures. And the hierarchy going through meta, metameta, etc. consequence levels exists for the same infinite family of many-valued logics. In a similar way all this work extends to modal and quantified many-valued logics. In brief, we have here a very general phenomenon.
I will present a sketch of the basic generalizations, of Strict/Tolerant and Tolerant/Strict, but I will not have time to discuss the hierarchies of such logics, nor will I have time to give proofs, beyond a basic sketch of the ideas involved.
-
Full Computation Tree Logic, commonly denoted CTL*, extends Linear Temporal Logic LTL by path quantification for reasoning about branching time. In contrast to traditional Computation Tree Logic CTL, path quantifiers may be freely combined with temporal operators, resulting in a more expressive language. We present a sound and complete hypersequent calculus for CTL*. The proof system is cyclic in the sense that proofs are finite trees with back-edges. A simple annotation mechanism on formulas provides a syntactic success condition on non-axiomatic leaves to guarantee soundness. Completeness is established by relating cyclic proofs to a natural ill-founded sequent calculus for the logic. This is joint work with B. Afshari and Graham E. Leigh.
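For readers less familiar with the language (a standard grammar with \(p\) ranging over atomic propositions; the precise syntax used in the talk may differ in inessential ways), CTL* state formulas \(\varphi\) and path formulas \(\psi\) are generated by
\[
\varphi ::= p \mid \neg\varphi \mid \varphi \wedge \varphi \mid \mathsf{A}\psi,
\qquad
\psi ::= \varphi \mid \neg\psi \mid \psi \wedge \psi \mid \mathsf{X}\psi \mid \psi\,\mathsf{U}\,\psi,
\]
with \(\mathsf{E}\) definable as \(\neg\mathsf{A}\neg\). In CTL, by contrast, every temporal operator must appear immediately under a path quantifier; this is the restriction that CTL* lifts.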
-
-
The study of Gödel’s shorthand notebooks in the ERC project GODELIANA has revealed two main aspects of his work. First, behind each of his relatively few published papers lies an enormous amount of notes, which typically reveal how he arrived at his results, as with the incompleteness theorem. An extreme case is that of Gödel’s 1938 results on set theory, preceded by some 700 pages of notes never studied before. Second, his threshold for publishing was by 1940 so high that some two thousand pages of developments and results remained unknown. One highlight here is a series of notebooks titled “Resultate Grundlagen”, in which numerous anticipations of later results in set theory are found. A second main topic is the relation between modal and intuitionistic logic. In particular, Gödel gives a syntactic translation from S4 to intuitionistic logic by methods that are readily generalizable to any decent intermediate logic. These newly discovered methods are, even by today’s standards, a clear step ahead in the study of interrelations between systems of non-classical logics.
-
(Dis)embodiment is a conference organized by the Centre for Linguistic Theory and Studies in Probability (CLASP) at the Department of Philosophy, Linguistics and Theory of Science. The conference will be held from September 14 to September 16. See the conference webpage for more details.
-
We describe a theory of computable functionals (TCF) which extends Heyting arithmetic in all simple types by (i) adding inductively and coinductively defined predicates, (ii) distinguishing computationally relevant (c.r.) and non-computational (n.c.) predicates, (iii) adding realizability predicates, and (iv) allowing partial functionals defined by equations (possibly non-terminating, like corecursion). The underlying (minimal) logic has just implication and universal quantification as primitive connectives; existence, disjunction and conjunction are inductively defined. The axioms of TCF are the defining axioms for (co)inductive predicates, bisimilarity axioms, and invariance axioms stating that “to assert is to realize” (Feferman 1978) for realizability-free formulas. Using these, one can prove in TCF a soundness theorem: the term extracted from a realizability-free proof of a formula A is a realizer of A. TCF is implemented in the Minlog proof assistant.
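To illustrate the point about inductively defined connectives (a hedged sketch in the usual style of such systems; TCF’s official axioms are formulated as least-fixed-point axioms for inductive predicates and may differ in detail), existence and disjunction can be given by introduction clauses together with elimination (least-fixed-point) axioms over implication and universal quantification:
\[
\forall x\,(A \to \exists x\,A), \qquad \forall x\,(A \to B) \to (\exists x\,A \to B) \quad (x \text{ not free in } B),
\]
\[
A \to A \vee B, \quad B \to A \vee B, \qquad (A \to C) \to (B \to C) \to (A \vee B \to C).
\]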
-
Motivated by the difficulty in proving faithfulness of various modal embeddings (starting with Gödel’s translation of intuitionistic logic into S4), we use labelled calculi to obtain simple and uniform faithfulness proofs for the embedding of intermediate logics into their modal companions, and of intuitionistic logic into provability logic, including extensions to infinitary logics.
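For concreteness, the embedding mentioned above is the Gödel–McKinsey–Tarski translation (one common variant among several; the talk’s systems are more general), which prefixes \(\Box\) at atoms and implications:
\[
p^{\Box} = \Box p,\quad \bot^{\Box} = \bot,\quad (A \wedge B)^{\Box} = A^{\Box} \wedge B^{\Box},\quad (A \vee B)^{\Box} = A^{\Box} \vee B^{\Box},\quad (A \to B)^{\Box} = \Box(A^{\Box} \to B^{\Box}),
\]
and faithfulness means \(\vdash_{IPC} A\) if and only if \(\vdash_{S4} A^{\Box}\).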
-
Syntax and semantics, often considered as conflicting aspects of logic, have turned out to be intertwined in a methodology for generating complete proof systems for wide families of non-classical logics. In this formal semantics, models can be considered as purely mathematical objects with no ontological assumptions placed upon them. More specifically, through the “labelled formalism”, which is by now a well-developed methodology, the semantics is turned into an essential component in the syntax of logical calculi. Thus enriched, the calculi not only constitute a tool for the automatisation of reasoning, but can also be used at the meta-level to establish general structural properties of logical systems and direct proofs of completeness, up to decidability in the terminating case. The calculi, in turn, can be used to find simplified models through conservativity results. The method will be illustrated with gradually generalised semantics, including topological ones such as neighbourhood semantics.
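As a small example of how semantic information enters the syntax in labelled calculi (a generic pair of rules for the modal box over relational (Kripke) semantics, illustrative only; the systems of the talk cover far more general semantics), with labels \(x, y\) for worlds and relational atoms \(xRy\) appearing inside sequents:
\[
\frac{xRy,\ \Gamma \Rightarrow \Delta,\ y{:}A}{\Gamma \Rightarrow \Delta,\ x{:}\Box A}\ (y \text{ fresh})
\qquad
\frac{y{:}A,\ x{:}\Box A,\ xRy,\ \Gamma \Rightarrow \Delta}{x{:}\Box A,\ xRy,\ \Gamma \Rightarrow \Delta}
\]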
-
Conservative extensions are a fundamental notion in logic. In the area of description logic, deciding whether a logical theory is a conservative extension of another theory is a fundamental reasoning problem with applications in ontology engineering tasks, such as ontology modularity, ontology reuse and ontology versioning. It has been observed in recent years that conservative extensions are decidable in many modal and description logics, given that they can often be characterized elegantly in terms of appropriate notions of bisimulations. But no work has been done for more expressive logics such as the two-variable fragment and the guarded fragment, which are considered to be generalizations of modal and description logics, and are typically used to explain their good computational behavior.
In this talk, I attempt to fill this gap by considering the decidability of conservative extensions in the two-variable guarded fragment (GFO\(^2\)). I will show that conservative extensions are decidable in GFO\(^2\) and that the computational complexity is 2ExpTime-complete. To prove these results I will rely on an automata-based approach, which consists in first giving a model-theoretic characterization of conservative extensions in terms of guarded bisimulations, and then using it as a foundation for a decision procedure based on tree automata. Since the usual notion of guarded bisimulation fails to characterize conservative extensions in GFO\(^2\), I will report on alternative model-theoretic characterizations in terms of bounded guarded bisimulations and the challenge of combining them with automata techniques. This talk is based on joint work with Carsten Lutz, Jean Christoph Jung, Thomas Schneider, and Frank Wolter: https://drops.dagstuhl.de/opus/volltexte/2017/7464/pdf/LIPIcs-ICALP-2017-108.pdf
-
Aristotle famously claimed that the only coherent form of infinity is potential, not actual. However many objects there are, it is possible for there to be yet more; but it is impossible for there in fact to be infinitely many objects. Although this view was superseded by Cantor’s transfinite set theory, even Cantor regarded the collection of all sets as “unfinished” or incapable of “being together”. In recent years, there has been a revival of interest in potentialist approaches to the philosophy and foundations of mathematics. The lecture provides a survey of such approaches, covering both technical results and associated philosophical views, as these emerge both in published work and in work in progress.
-
In constructive type theory and explicit mathematics, the idea of universes has been used to strengthen the expressive power or the proof-theoretic strength of a given system. Similarly, Cantini (1996) and Kahle (2003) introduced universes for axiomatic truth theories over TON (total applicative theory). Set-theoretic universes are usually large sets that are closed under several operations like complements, whereas truth-theoretic universes can be seen as quasi-truth predicates which satisfy some axioms for the truth predicate. Cantini and Kahle then showed that by adding the axiom assuring the existence of universes, several truth theories become proof-theoretically stronger.
In this talk, I will formulate the theory VFU, an extension of Cantini’s supervaluational truth theory VF (cf. Kahle, 2002) with the universe-generating axiom, and then prove that VFU has the same proof-theoretic strength as KPi (Kripke-Platek set theory with the recursively inaccessible universe) and T0 (Feferman’s explicit mathematics with the inductive generation axiom). To determine the lower bound, I show that T0 is relatively interpretable in VFU. As for the upper bound, I give a truth-as-provability interpretation for VFU, where the truth predicate is understood as provability in a certain infinite derivation system. As this is formalisable in KPi, a relative interpretation of VFU in KPi is obtained. Therefore, all three systems are proof-theoretically equivalent. Finally, I would like to mention the possibility of constructing natural numbers in VFU.
-
What is a Gödel sentence? Is there such a thing as the Gödel sentence of a given theory?
Lajevardi and Salehi, in a short paper from last year, argue against the use of the definite article in the expression ‘the Gödel sentence’, by claiming that any unsound theory has Gödelian sentences with different truth values. We show that the two theorems of their paper are special cases (modulo Löb’s theorem and the first incompleteness theorem) of general observations pertaining to fixed points of any formula, and argue that the false sentences of Lajevardi and Salehi are in fact not Gödel sentences. We conclude with a discussion of the roles played by soundness and truth in drawing further consequences from the incompleteness theorems.
The paper on which this talk is based is joint work with Christian Bennet.
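For orientation (standard notation, not the authors’), a Gödel sentence for a theory \(T\) is any sentence \(G\) obtained by diagonalization such that
\[
T \vdash G \leftrightarrow \neg\mathrm{Prov}_T(\ulcorner G \urcorner),
\]
and, more generally, the diagonal lemma yields for every formula \(\varphi(x)\) a sentence \(\psi\) with \(T \vdash \psi \leftrightarrow \varphi(\ulcorner \psi \urcorner)\); the general observations mentioned above concern fixed points in this broader sense.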
-
Progressions of theories along paths through Kleene’s \( \mathcal{O} \), adding the consistency of the previous theory at every successor step, can deduce every true \( \Pi^0_1 \)-statement. This was shown by Turing in his 1938 thesis, where he called these progressions “ordinal logics”. In 1962 Feferman proved the amazing theorem that progressions based on the “uniform reflection principle” can deduce every true arithmetic statement. In contrast to Turing’s, Feferman’s proof is very complicated, involving several cunning applications of self-reference via the recursion theorem. Using Schütte’s method of search trees (or decomposition trees) for omega-logic and reflexive induction, however, one can give a rather transparent proof.
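Schematically (standard formulations, suppressing the details of the notation system), Turing’s progressions add consistency at successor steps, while Feferman’s completeness theorem concerns progressions that add uniform reflection instead:
\[
T_0 = \mathrm{PA}, \qquad T_{a+1} = T_a + \mathrm{Con}(T_a) \ \text{(or } T_a + \mathrm{RFN}(T_a)\text{)}, \qquad T_{\lambda} = \bigcup_{b <_{\mathcal{O}} \lambda} T_b,
\]
\[
\mathrm{RFN}(T):\quad \forall x\,\bigl(\mathrm{Prov}_T(\ulcorner \varphi(\dot{x}) \urcorner) \to \varphi(x)\bigr) \quad \text{for each formula } \varphi(x).
\]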
-
I present an untyped theory of knowledge and truth that solves the knower paradoxes of Kaplan and Montague from the 1960s. The underlying idea is (1) to formalize the principle of veracity (that whatever is known is true) more precisely, and (2) to embrace self-reference in the spirit of the Friedman-Sheard theory of truth and its associated revision semantics. It turns out that this facilitates expedient reasoning with common-knowledge predicates defined by self-referential formulas obtained by Gödel diagonalization. Moreover, the revision semantics is generalized to accommodate my theory. Apart from answering questions of consistency, this opens up philosophical insights into the meaning of sentences involving iterations of knowledge and truth, such as ‘“Kim knows A” is true’ and ‘Kim knows “Kim knows A”’.
-
The practice of foundations of mathematics is built around a firm distinction between syntax and semantics. But how stable is this distinction, and is it always the case that semantically presented mathematical objects, e.g. in the form of a model class, give rise to a “natural logic”? In this talk I will investigate different scenarios from set and model theory in which an investigation of the notion of an implicit or internal logic or syntax becomes possible. Time permitting, we will also discuss the question whether logics without a syntax can be considered logics at all.
-
Substructural logics are logics with restricted use of structural rules like weakening, contraction and exchange. There are several motivations for investigating substructural logics, ranging from the study of algebraic structures like residuated (semi)lattices to the study of type systems that allow for greater control over the number of times and the order in which variables are used.
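For reference, the three structural rules just mentioned have the usual left-hand formulations (symmetric right-hand versions exist as well; this is a generic presentation, not the talk’s specific calculi):
\[
\frac{\Gamma \Rightarrow \Delta}{\Gamma, A \Rightarrow \Delta}\ (\text{weakening})
\qquad
\frac{\Gamma, A, A \Rightarrow \Delta}{\Gamma, A \Rightarrow \Delta}\ (\text{contraction})
\qquad
\frac{\Gamma, A, B, \Sigma \Rightarrow \Delta}{\Gamma, B, A, \Sigma \Rightarrow \Delta}\ (\text{exchange})
\]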
Enriched with fixed point operators, these logics are highly expressive. In this talk, I will give a brief overview of substructural logics with fixed points and concentrate specifically on linear logic with fixed points, discussing various proof systems, their expressivity, and semantics.
-
Transitive Closure Logic (TCL) enriches the language of first-order logic with a recursive operator expressing the transitive closure of a binary relation. Cyclic sequent calculus proofs for this logic have been introduced in the literature. In this talk, I will present a new cyclic proof system for TCL, which employs hypersequents. The main motivation behind this work is that, unlike sequent TCL proofs, hypersequent proofs for TCL successfully simulate cyclic proofs in the transitive fragment of Propositional Dynamic Logic (PDL+), without using the non-analytic rule of cut.
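Recall the intended semantics of the operator (stated informally, and under the convention that the closure is irreflexive, i.e. chains of length at least one; reflexive-transitive variants also appear in the literature): a transitive closure formula holds of a pair exactly when it is connected by a finite nonempty \(\varphi\)-chain,
\[
\mathfrak{M} \models (TC_{x,y}\,\varphi)(s,t) \iff \text{there are } n \geq 1 \text{ and } u_0 = s,\, u_1, \dots, u_n = t \text{ with } \mathfrak{M} \models \varphi(u_i, u_{i+1}) \text{ for all } i < n.
\]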
After introducing the cyclic hypersequent calculus for TCL, I will present two main results. First, I will sketch a proof of soundness for the calculus, which requires a more involved argument than the one used in traditional cyclic proofs. Second, I will show how PDL+ cyclic proofs can be translated into TCL hypersequent proofs. This result allows us to lift the completeness result for cyclic PDL+ to the hypersequent calculus. This talk is based on joint work with Anupam Das.
-
The first part will be about the representation of mathematics on a computer. Questions that arise there are naturally reminiscent of issues encountered when teaching formal proofs in a basic logic course, e.g. how to deal with free and bound variables, and instantiation rules. As discussed in a 1962 paper of Tarski, these issues are “clearly experienced both in teaching an elementary course in mathematical logic and in formalizing the syntax of predicate logic for some theoretical purposes.” I will present two quite different approaches to this problem: one inspired by Tarski’s paper (N. Megill, system Metamath) and one using dependent type theory (N. G. de Bruijn).
The second part will then try to explain how notations introduced by dependent type theory suggest new insights for old questions coming from Principia Mathematica (extensionality, reducibility axiom) through the notion of universe, introduced by Grothendieck for representing category theory in set theory, and introduced in dependent type theory by P. Martin-Löf.
-
-
Description Logics (DLs) are a family of knowledge representation languages, used to reason with information sets in a structured and well-understood way. They play a big role in any field using large amounts of data, e.g. artificial intelligence and databases, and they are important in providing a logical formalism for ontologies and the Semantic Web. A common battle in the world of DLs is the one between expressivity, where we add fixpoints, inverses, role restrictions, etc., to a standard DL, and decidability. That is why more expressive as well as lightweight DLs are considered, both of which benefit from different derivation systems. In this talk I will first give an introduction to the general area of Description Logic, and then focus on the different reasoning problems and how they are solved proof-theoretically. The main result will be from Martin Hofmann’s paper presenting a sequent calculus for the lightweight description logic EL with its fixpoint extension; we will see how cyclic proofs are used in this context. This talk is based on the reading course I did together with Bahareh Afshari as preparation for my Master thesis.
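As background (the standard concept grammar of EL, with \(A\) an atomic concept and \(r\) a role name; the fixpoint extension adds least and greatest fixpoint constructors on top of this), EL concepts are built by
\[
C, D ::= \top \mid A \mid C \sqcap D \mid \exists r.C,
\]
and a typical terminological axiom has the form \(C \sqsubseteq D\), e.g. \(\mathsf{Parent} \sqsubseteq \exists \mathsf{hasChild}.\mathsf{Person}\) (an illustrative axiom, not drawn from the talk).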
-
Reasoning with generalized quantifiers in natural language combines logical and arithmetical features, transcending divides between qualitative and quantitative. This practice blends with inference patterns in ‘grassroots mathematics’ such as pigeon-hole principles. Our topic is this cooperation of logic and counting on a par, studied with small systems and gradually moving upward.
We start with monadic first-order logic with counting. We provide normal forms that allow for axiomatization, determine which arithmetical notions are definable, and conversely, discuss which logical notions and reasoning principles can be defined out of arithmetical ones. Next we study a series of strengthenings in the same style, including second-order versions, systems with multiple counting, and a new modal logic with counting. As a complement to our fragment approach, we also discuss another way of controlling complexity: changing the semantics of counting to reason about ‘mass’ or other aggregating notions than cardinalities. Finally, we return to the basic reasoning practices that lie embedded in natural language, confronting our formal systems with linguistic quantifier vocabulary, monotonicity reasoning, and procedural semantics via semantic automata. We conclude with some pointers to further entanglements of logic and counting in the metamathematics of formal systems, the philosophy of logic, and cognitive psychology. (Joint work with Thomas Icard)
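As a small illustration of how counting and first-order logic interact (a standard definability fact, not a quotation from the paper), the threshold counting quantifier is first-order expressible for each fixed \(k\):
\[
\exists^{\geq k} x\, \varphi(x) \;:=\; \exists x_1 \cdots \exists x_k \Bigl(\bigwedge_{1 \le i < j \le k} x_i \neq x_j \;\wedge\; \bigwedge_{1 \le i \le k} \varphi(x_i)\Bigr),
\]
whereas comparative cardinality quantifiers such as “there are at least as many \(\varphi\)s as \(\psi\)s” go beyond first-order logic, which is one reason to study counting as a primitive on a par with logical vocabulary.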
Paper available at: https://eprints.illc.uva.nl/id/eprint/1813/1/Logic.Counting.pdf