Nordic Online Logic Seminar
An online seminar for logicians and logic aficionados worldwide.
The Nordic Online Logic Seminar (NOL Seminar) is a monthly seminar series initiated in 2021 presenting expository talks by logicians on topics of interest for the broader logic community. Initially the series focused on activities of the Nordic logic groups, but has since expanded to offer a variety of talks from logicians around the world. The seminar is open to professional or aspiring logicians and logic aficionados worldwide.
The tentative time slot is Monday, 16.00-17.30 (Stockholm/Sweden time). If you wish to receive the Zoom ID and password, as well as regular announcements, please subscribe to the NOL Seminar mailing list.
NOL seminar organisers
Valentin Goranko and Graham Leigh
-
Compositional semantics that acknowledge vagueness by positing degrees of truth intermediate between truth and falsity can retain classical sentential calculus, provided the degrees form a Boolean algebra. A valid deduction guarantees that the degree of truth of the conclusion is at least as great as every lower bound on the degrees of the premises. If we extend the language to allow infinite disjunctions and conjunctions, the Boolean algebra will need to be complete and atomic. If we extend further by adding quantifiers ranging over a fixed domain, we get the supervaluations proposed by Bas van Fraassen.
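The first claim can be checked concretely. Below is a minimal sketch (mine, not from the talk): degrees of truth are modelled as subsets of a three-element set, i.e. a powerset Boolean algebra, and we verify that a classical tautology always takes the top degree and that modus ponens never drops the conclusion below the meet of the premises.

```python
from itertools import combinations

# Degrees of truth as subsets of a finite set U: the powerset Boolean algebra.
# Top = U (plain truth), bottom = empty set (plain falsity),
# proper subsets = intermediate degrees.
U = frozenset(range(3))

def neg(a): return U - a           # Boolean complement
def conj(a, b): return a & b       # meet (intersection)
def disj(a, b): return a | b       # join (union)
def impl(a, b): return neg(a) | b  # material implication

def subsets(s):
    s = list(s)
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

# A classical tautology takes degree Top under every assignment:
assert all(disj(p, neg(p)) == U for p in subsets(U))    # excluded middle

# A valid deduction (here modus ponens) guarantees the conclusion's degree
# is at least the meet of the premises' degrees:
assert all(conj(p, impl(p, q)) <= q                     # <= is the subset order
           for p in subsets(U) for q in subsets(U))
```

The same checks go through in any Boolean algebra; the powerset case is just the easiest to enumerate.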
-
I present a topological epistemic logic, motivated by a famous epistemic puzzle: the Surprise Exam Paradox. It is a fixed-point modal logic, with modalities for knowledge (modelled as the universal modality), knowability of a proposition (represented by the topological interior operator), and (un)knowability of the actual world. The last notion has a non-self-referential reading (modelled by Cantor derivative: the set of limit points of a given set) and a self-referential one (modelled by the so-called perfect core of a given set: its largest subset which is a fixed point of relativized derivative). I completely axiomatize this logic, showing that it is decidable and PSPACE-complete, as well as briefly explain how the same model-theoretic method can be elaborated to prove the completeness and decidability of the full topological mu-calculus. Finally, I apply it to the analysis of the Surprise Exam Paradox and other puzzles.
References:
- A. Baltag, N. Bezhanishvili, D. Fernández-Duque. The Topology of Surprise. Proceedings of the International Conference on Principles of Knowledge Representation and Reasoning. Vol. 19 (1), 33-42, 2022. Available online in ILLC Prepublication (PP) series PP-2022-06.
- A. Baltag, N. Bezhanishvili, D. Fernández-Duque. The Topological Mu-Calculus: Completeness and Decidability. LICS ’21: Proceedings of the 36th Annual ACM/IEEE Symposium on Logic in Computer Science, vol 89: 1-13, 2021. doi:10.1109/lics52264.2021.9470560. Available online in ILLC Prepublication (PP) series PP-2021-07.
-
In the book Das Kontinuum (1918), Hermann Weyl presents a coherent and sophisticated approach to analysis from a predicativist perspective. In the first chapter of (Weyl 1918), Weyl introduces a predicative concept of set, according to which sets are built ‘from the bottom up’ starting from the natural numbers. Weyl clearly contrasts this predicative concept of set with the concept of arbitrary set, which he finds wanting, especially when working with infinite sets. In the second chapter of Das Kontinuum, he goes on to show that large portions of 19th century analysis can be developed on the basis of his predicative concept of set.
Das Kontinuum inspired fundamental ideas in mathematical logic and beyond, such as the logical analysis of predicativity in the 1950s and 1960s, Solomon Feferman’s work on predicativity and Errett Bishop’s constructive mathematics. The seeds of Das Kontinuum are already visible in the early (Weyl 1910), where Weyl, among other things, offers a clarification of Zermelo’s axiom schema of Separation.
In this talk, I examine Weyl’s predicative concept of set in (Weyl 1918) and discuss its origins in (Weyl 1910).
Bibliography
- Weyl, H., 1910, Über die Definitionen der mathematischen Grundbegriffe, Mathematisch-naturwissenschaftliche Blätter, 7, pp. 93-95 and pp. 109-113.
- Weyl, H., 1918, Das Kontinuum. Kritische Untersuchungen über die Grundlagen der Analysis, Veit, Leipzig. English translation: Dover Books on Mathematics, 2003.
-
Strict/tolerant logic, ST, has been of considerable interest in the last few years, in part because it forces consideration of what it means for two logics to be different, or the same. And thus, of what it means to be a logic. The basic idea behind ST is that it evaluates the premises and the conclusions of its consequence relation differently, with the premises held to stricter standards while conclusions are treated more tolerantly. More specifically, ST is a three-valued logic with left sides of sequents understood as if in Kleene’s Strong Three Valued Logic, and right sides as if in Priest’s Logic of Paradox. Surprisingly, this hybrid validates the same sequents that classical logic does, though it differs from classical logic at the metaconsequence level. A version of this result has been extended to meta, metameta, etc. consequence levels, creating a very interesting hierarchy of logics. All this is work of others, and I will summarize it.
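The basic ST idea can be made concrete. A minimal illustration (mine, not from the talk): premises and conclusions share Kleene’s Strong three-valued tables but use different designated sets, and a sequent is ST-valid when strictly-true premises force an at-least-tolerantly-true conclusion.

```python
from itertools import product

# Kleene's Strong three-valued truth tables (shared by K3 and LP):
VALS = (0.0, 0.5, 1.0)
def NEG(a): return 1.0 - a
def AND(a, b): return min(a, b)
def OR(a, b): return max(a, b)

STRICT = {1.0}           # K3-style designation: premises (left of the sequent)
TOLERANT = {1.0, 0.5}    # LP-style designation: conclusion (right of the sequent)

def st_valid(premises, conclusion, n_atoms):
    """A sequent is ST-valid iff every valuation making all premises
    strictly designated makes the conclusion tolerantly designated."""
    return all(conclusion(*v) in TOLERANT
               for v in product(VALS, repeat=n_atoms)
               if all(p(*v) in STRICT for p in premises))

# Explosion p, ~p |- q holds in ST (as classically), though not in LP:
assert st_valid([lambda p, q: p, lambda p, q: NEG(p)], lambda p, q: q, 2)
# Excluded middle |- p v ~p holds too, though not in K3:
assert st_valid([], lambda p: OR(p, NEG(p)), 1)
```

Brute-force checks like this confirm individual classical sequents; the full coincidence with classical validity, and the divergence at the metaconsequence level, are of course the theorems the talk summarizes.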
My contribution to the subject is to show that the original ideas behind ST are, in fact, much more general than they first seemed, and an infinite family of many-valued logics have Strict/Tolerant counterparts. Besides classical logic, this family includes both Kleene’s and Priest’s logics themselves, as well as first degree entailment. For instance, for both the Kleene and the Priest logic, the corresponding strict/tolerant logic is six-valued, but with differing sets of strictly and tolerantly designated truth values. There is a reverse notion, of Tolerant/Strict logics, which exist for the same structures. And the hierarchy going through meta, metameta, etc. consequence levels exists for the same infinite family of many-valued logics. In a similar way all this work extends to modal and quantified many-valued logics. In brief, we have here a very general phenomenon.
I will present a sketch of the basic generalizations, of Strict/Tolerant and Tolerant/Strict, but I will not have time to discuss the hierarchies of such logics, nor will I have time to give proofs, beyond a basic sketch of the ideas involved.
-
The study of Gödel’s shorthand notebooks in the ERC project GODELIANA has revealed two main aspects of his work: First, there is behind each of his relatively few published papers an enormous amount of notes. They typically reveal how he arrived at his results, as with the incompleteness theorem. An extreme case is Gödel’s 1938 results on set theory, preceded by some 700 pages of notes never studied before. Secondly, his threshold for publishing was by 1940 so high that some two thousand pages of developments and results remained unknown. One highlight here is a series of notebooks titled “Resultate Grundlagen” in which numerous anticipations of later results in set theory are found. A second main topic is the relations between modal and intuitionistic logic. In particular, Gödel gives a syntactic translation from S4 to intuitionistic logic by methods that are readily generalizable to any decent intermediate logic. These newly discovered methods are, even by today’s standards, a clear step ahead in the study of interrelations between systems of non-classical logics.
-
Aristotle famously claimed that the only coherent form of infinity is potential, not actual. However many objects there are, it is possible for there to be yet more; but it is impossible for there in fact to be infinitely many objects. Although this view was superseded by Cantor’s transfinite set theory, even Cantor regarded the collection of all sets as “unfinished” or incapable of “being together”. In recent years, there has been a revival of interest in potentialist approaches to the philosophy and foundations of mathematics. The lecture provides a survey of such approaches, covering both technical results and associated philosophical views, as these emerge both in published work and in work in progress.
-
Progressions of theories along paths through Kleene’s \( \mathcal{O} \), adding the consistency of the previous theory at every successor step, can deduce every true \( \Pi^0_1 \)-statement. This was shown by Turing in his 1938 thesis, where he called these progressions “ordinal logics”. In 1962 Feferman proved the amazing theorem that progressions based on the “uniform reflection principle” can deduce every true arithmetic statement. In contrast to Turing’s, Feferman’s proof is very complicated, involving several cunning applications of self-reference via the recursion theorem. Using Schütte’s method of search trees (or decomposition trees) for omega-logic and reflexive induction, however, one can give a rather transparent proof.
-
The practice of foundations of mathematics is built around a firm distinction between syntax and semantics. But how stable is this distinction, and can semantically presented mathematical objects, e.g. in the form of a model class, give rise to a “natural logic”? In this talk I will investigate different scenarios from set and model theory in which an investigation of the notion of an implicit or internal logic or syntax becomes possible. Time permitting, we will also discuss the question of whether logics without a syntax can be considered logics at all.
-
The first part will be about representation of mathematics on a computer. Questions that arise there are naturally reminiscent of issues that arise when teaching formal proofs in a basic logic course, e.g. how to deal with free and bound variables, and instantiation rules. As discussed in a 1962 paper of Tarski, these issues are “clearly experienced both in teaching an elementary course in mathematical logic and in formalizing the syntax of predicate logic for some theoretical purposes.” I will present two quite different approaches to this problem: one inspired by Tarski’s paper (N. Megill, system Metamath) and one using dependent type theory (N.G. de Bruijn).
The second part will then try to explain how notations introduced by dependent type theory suggest new insights for old questions coming from Principia Mathematica (extensionality, reducibility axiom) through the notion of universe, introduced by Grothendieck for representing category theory in set theory, and introduced in dependent type theory by P. Martin-Löf.
-
Reasoning with generalized quantifiers in natural language combines logical and arithmetical features, transcending divides between qualitative and quantitative. This practice blends with inference patterns in ‘grassroots mathematics’ such as pigeon-hole principles. Our topic is this cooperation of logic and counting on a par, studied with small systems and gradually moving upward.
We start with monadic first-order logic with counting. We provide normal forms that allow for axiomatization, determine which arithmetical notions are definable, and conversely, discuss which logical notions and reasoning principles can be defined out of arithmetical ones. Next we study a series of strengthenings in the same style, including second-order versions, systems with multiple counting, and a new modal logic with counting. As a complement to our fragment approach, we also discuss another way of controlling complexity: changing the semantics of counting to reason about ‘mass’ or aggregating notions other than cardinalities. Finally, we return to the basic reasoning practices that lie embedded in natural language, confronting our formal systems with linguistic quantifier vocabulary, monotonicity reasoning, and procedural semantics via semantic automata. We conclude with some pointers to further entanglements of logic and counting in the metamathematics of formal systems, the philosophy of logic, and cognitive psychology. (Joint work with Thomas Icard)
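The semantic-automata idea mentioned at the end can be sketched briefly (my illustration, not from the paper): a quantifier like “at least two” is a finite automaton reading, for each object in the restrictor, a 1 if the object falls under the scope and a 0 otherwise.

```python
# A semantic automaton for the quantifier "at least two": each object in the
# restrictor A is coded 1 if it also lies in B, else 0; the automaton accepts
# exactly the strings containing at least two 1s.
def at_least_two(word):
    state = 0                 # states 0, 1, 2 count the 1s seen so far (capped)
    for bit in word:
        if bit == 1 and state < 2:
            state += 1
    return state == 2         # state 2 is the sole accepting state

# Hypothetical example domain: "At least two A are B".
A = ["ann", "bob", "cat", "dan"]
B = {"bob", "dan"}
word = [1 if x in B else 0 for x in A]   # -> [0, 1, 0, 1]
assert at_least_two(word)
assert not at_least_two([0, 1, 0])
```

First-order definable quantifiers correspond to especially simple (permutation-invariant, acyclic) automata; counting quantifiers like this one need genuine counting states.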
Paper available at: https://eprints.illc.uva.nl/id/eprint/1813/1/Logic.Counting.pdf