Nordic Online Logic Seminar
An online seminar for logicians and logic aficionados worldwide.
The Nordic Online Logic Seminar (NOL Seminar) is a monthly seminar series, initiated in 2021, presenting expository talks by logicians on topics of interest to the broader logic community. Initially the series focused on the activities of the Nordic logic groups, but it has since expanded to offer a variety of talks by logicians from around the world. The seminar is open to professional and aspiring logicians and logic aficionados worldwide.
The tentative time slot is Mondays, 16.00-17.30 (Stockholm/Sweden time). To receive the Zoom ID and password for the meetings, as well as regular announcements, please subscribe to the NOL Seminar mailing list.
NOL Seminar organisers
Valentin Goranko and Graham Leigh
-
Laura Crosilla (University of Oslo)
On Weyl's predicative concept of set
In the book Das Kontinuum (1918), Hermann Weyl presents a coherent and sophisticated approach to analysis from a predicativist perspective. In the first chapter of (Weyl 1918), Weyl introduces a predicative concept of set, according to which sets are built ‘from the bottom up’ starting from the natural numbers. Weyl clearly contrasts this predicative concept of set with the concept of arbitrary set, which he finds wanting, especially when working with infinite sets. In the second chapter of Das Kontinuum, he goes on to show that large portions of 19th century analysis can be developed on the basis of his predicative concept of set.
Das Kontinuum inspired fundamental ideas in mathematical logic and beyond, such as the logical analysis of predicativity in the 1950s and 1960s, Solomon Feferman’s work on predicativity, and Errett Bishop’s constructive mathematics. The seeds of Das Kontinuum are already visible in the early (Weyl 1910), where Weyl, among other things, offers a clarification of Zermelo’s axiom schema of Separation.
In this talk, I examine Weyl’s predicative concept of set in (Weyl 1918) and discuss its origins in (Weyl 1910).
Bibliography
- Weyl, H., 1910, Über die Definitionen der mathematischen Grundbegriffe, Mathematisch-naturwissenschaftliche Blätter, 7, pp. 93-95 and pp. 109-113.
- Weyl, H., 1918, Das Kontinuum. Kritische Untersuchungen über die Grundlagen der Analysis, Veit, Leipzig. English translation: Dover Books on Mathematics, 2003.
-
Melvin Fitting (Graduate Center, City University of New York)
Strict/Tolerant Logic and Strict/Tolerant Logics
Strict/tolerant logic, ST, has been of considerable interest in the last few years, in part because it forces consideration of what it means for two logics to be different, or the same, and thus of what it means to be a logic at all. The basic idea behind ST is that it evaluates the premises and the conclusions of its consequence relation differently, with the premises held to stricter standards while conclusions are treated more tolerantly. More specifically, ST is a three-valued logic with the left sides of sequents understood as if in Kleene’s Strong Three-Valued Logic, and the right sides as if in Priest’s Logic of Paradox. Surprisingly, this hybrid validates the same sequents that classical logic does, though it differs from classical logic at the metaconsequence level. A version of this result has been extended to meta, metameta, etc. consequence levels, creating a very interesting hierarchy of logics. All this is work of others, and I will summarize it.
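For concreteness, here is the usual way the ST consequence relation is presented in this literature, over the value set \( \{0, \tfrac{1}{2}, 1\} \) with the strong Kleene truth tables (notation varies across presentations):
\[
\Gamma \models_{\mathrm{ST}} \Delta \quad\Longleftrightarrow\quad \text{for every valuation } v:\ \text{if } v(\gamma) = 1 \text{ for all } \gamma \in \Gamma, \text{ then } v(\delta) \in \{\tfrac{1}{2}, 1\} \text{ for some } \delta \in \Delta.
\]
The strict standard, value exactly 1, is the designated set of Kleene’s logic, while the tolerant standard, value at least \( \tfrac{1}{2} \), is the designated set of Priest’s Logic of Paradox; the Tolerant/Strict notion mentioned below simply swaps the two standards.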
My contribution to the subject is to show that the original ideas behind ST are, in fact, much more general than they first seemed, and that an infinite family of many-valued logics has Strict/Tolerant counterparts. Besides classical logic, this family includes both Kleene’s and Priest’s logics themselves, as well as first-degree entailment. For instance, for both the Kleene and the Priest logic, the corresponding strict/tolerant logic is six-valued, but with differing sets of strictly and tolerantly designated truth values. There is a reverse notion, of Tolerant/Strict logics, which exist for the same structures. And the hierarchy running through meta, metameta, etc. consequence levels exists for the same infinite family of many-valued logics. In a similar way all this work extends to modal and quantified many-valued logics. In brief, we have here a very general phenomenon.
I will present a sketch of the basic generalizations, of Strict/Tolerant and Tolerant/Strict, but I will not have time to discuss the hierarchies of such logics, nor will I have time to give proofs, beyond a basic sketch of the ideas involved.
-
Jan von Plato (University of Helsinki)
Gödel's work in logic and foundations in the light of his shorthand notebooks
The study of Gödel’s shorthand notebooks in the ERC project GODELIANA has revealed two main aspects of his work. First, behind each of his relatively few published papers lies an enormous amount of notes. They typically reveal how he arrived at his results, as with the incompleteness theorem. An extreme case is that of Gödel’s 1938 results on set theory, which are preceded by some 700 pages of notes never studied before. Second, by 1940 his threshold for publishing was so high that some two thousand pages of developments and results remained unknown. One highlight here is a series of notebooks titled “Resultate Grundlagen” in which numerous anticipations of later results in set theory are found. A second main topic is the relations between modal and intuitionistic logic. In particular, Gödel gives a syntactic translation from S4 to intuitionistic logic by methods that are readily generalizable to any decent intermediate logic. These newly discovered methods are, even by today’s standards, a clear step ahead in the study of interrelations between systems of non-classical logics.
-
Øystein Linnebo (University of Oslo)
Potentialism in the philosophy and foundations of mathematics
Aristotle famously claimed that the only coherent form of infinity is potential, not actual. However many objects there are, it is possible for there to be yet more; but it is impossible for there in fact to be infinitely many objects. Although this view was superseded by Cantor’s transfinite set theory, even Cantor regarded the collection of all sets as “unfinished” or incapable of “being together”. In recent years, there has been a revival of interest in potentialist approaches to the philosophy and foundations of mathematics. The lecture provides a survey of such approaches, covering both technical results and associated philosophical views, as these emerge both in published work and in work in progress.
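One standard way of making the potentialist idea precise, common in this literature (the lecture’s own framework may of course differ), is to read the quantifiers modally:
\[
(\exists x\, \varphi)^{\Diamond} := \Diamond \exists x\, \varphi^{\Diamond}, \qquad (\forall x\, \varphi)^{\Diamond} := \Box \forall x\, \varphi^{\Diamond},
\]
with the translation commuting with the propositional connectives, so that ‘there is an object such that …’ is read as ‘there could be an object such that …’, and ‘every object’ as ‘every object there could be’.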
-
Michael Rathjen (University of Leeds)
Completeness: Turing, Schütte, Feferman (and Löb)
Progressions of theories along paths through Kleene’s \( \mathcal{O} \), obtained by adding the consistency of the previous theory at every successor step, can deduce every true \( \Pi^0_1 \)-statement. This was shown by Turing in his 1938 thesis, where he called these progressions “ordinal logics”. In 1962 Feferman proved the amazing theorem that progressions based on the “uniform reflection principle” can deduce every true arithmetical statement. In contrast to Turing’s, Feferman’s proof is very complicated, involving several cunning applications of self-reference via the recursion theorem. Using Schütte’s method of search trees (or decomposition trees) for omega-logic and reflexive induction, however, one can give a rather transparent proof.
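Schematically, and glossing over the fact that the progressions are indexed by ordinal notations rather than by ordinals themselves, the consistency progression over a base theory such as PA looks like
\[
T_0 = \mathrm{PA}, \qquad T_{a+1} = T_a + \mathrm{Con}(T_a), \qquad T_\lambda = \bigcup_{a < \lambda} T_a \ \ (\lambda \text{ a limit}),
\]
while Feferman’s progressions instead add at each step the uniform reflection principle, the schema
\[
\mathrm{RFN}(T):\quad \forall x\, \big( \mathrm{Pr}_T(\ulcorner \varphi(\dot{x}) \urcorner) \rightarrow \varphi(x) \big) \quad \text{for each arithmetical formula } \varphi(x).
\]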
-
Juliette Kennedy (University of Helsinki)
Reading syntax off semantics
The practice of foundations of mathematics is built around a firm distinction between syntax and semantics. But how stable is this distinction, and is it always the case that semantically presented mathematical objects, e.g. in the form of a model class, give rise to a “natural logic”? In this talk I will investigate different scenarios from set theory and model theory in which an investigation of the notion of an implicit or internal logic or syntax becomes possible. Time permitting, we will also discuss the question of whether logics without a syntax can be considered logics at all.
-
Thierry Coquand (Gothenburg University)
Formalization of Mathematics and Dependent Type Theory
The first part will be about the representation of mathematics on a computer. Questions that arise there are naturally reminiscent of issues that come up when teaching formal proofs in a basic logic course, e.g. how to deal with free and bound variables and with instantiation rules. As discussed in a 1962 paper of Tarski, these issues are “clearly experienced both in teaching an elementary course in mathematical logic and in formalizing the syntax of predicate logic for some theoretical purposes.” I will present two quite different approaches to this problem: one inspired by Tarski’s paper (N. Megill’s Metamath system) and one using dependent type theory (N. G. de Bruijn).
The second part will then try to explain how the notations introduced by dependent type theory suggest new insights into old questions coming from Principia Mathematica (extensionality, the axiom of reducibility), through the notion of universe, introduced by Grothendieck for representing category theory in set theory and introduced in dependent type theory by P. Martin-Löf.
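For readers less familiar with the type-theoretic notion: a universe is a type whose elements are themselves types, and in standard Martin-Löf-style notation (which may differ from the notation used in the talk) universes form a hierarchy closed under the type formers, for instance dependent products:
\[
U_0 : U_1 : U_2 : \cdots, \qquad\qquad \frac{\Gamma \vdash A : U_i \qquad \Gamma,\ x : A \vdash B : U_i}{\Gamma \vdash \Pi(x : A).\, B : U_i}.
\]
The stratification into levels is what blocks a paradoxical ‘type of all types’, and it is this layering that invites comparison with the ramification and reducibility questions of Principia Mathematica mentioned above.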
-
Johan van Benthem (University of Amsterdam, Stanford University, and Tsinghua University)
Interleaving Logic and Counting
Reasoning with generalized quantifiers in natural language combines logical and arithmetical features, transcending divides between qualitative and quantitative. This practice blends with inference patterns in ‘grassroots mathematics’ such as pigeon-hole principles. Our topic is this cooperation of logic and counting on a par, studied with small systems and gradually moving upward.
We start with monadic first-order logic with counting. We provide normal forms that allow for axiomatization, determine which arithmetical notions are definable, and conversely, discuss which logical notions and reasoning principles can be defined out of arithmetical ones. Next we study a series of strengthenings in the same style, including second-order versions, systems with multiple counting, and a new modal logic with counting. As a complement to our fragment approach, we also discuss another way of controlling complexity: changing the semantics of counting to reason about ‘mass’ or aggregating notions other than cardinalities. Finally, we return to the basic reasoning practices that lie embedded in natural language, confronting our formal systems with linguistic quantifier vocabulary, monotonicity reasoning, and procedural semantics via semantic automata. We conclude with some pointers to further entanglements of logic and counting in the metamathematics of formal systems, the philosophy of logic, and cognitive psychology. (Joint work with Thomas Icard)
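As a simple illustration of the base level (a textbook rendering; the system’s actual primitives and normal forms are developed in the paper), threshold counting quantifiers are already first-order definable:
\[
\exists^{\geq k} x\, \varphi(x) \;:=\; \exists x_1 \cdots \exists x_k \Big( \bigwedge_{1 \leq i < j \leq k} x_i \neq x_j \ \wedge\ \bigwedge_{1 \leq i \leq k} \varphi(x_i) \Big),
\]
whereas genuinely comparative quantifiers such as ‘most’ are not first-order definable, which is one place where the interplay of logical and arithmetical reasoning becomes substantive.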
Paper available at: https://eprints.illc.uva.nl/id/eprint/1813/1/Logic.Counting.pdf
-
Lars Birkedal (Aarhus University)
Iris: A Higher-Order Concurrent Separation Logic Framework
I will introduce some of our research on Iris, a higher-order concurrent separation logic framework, implemented and verified in the Coq proof assistant, which can be used for mathematical reasoning about safety and correctness of concurrent higher-order imperative programs. Iris has been used for many different applications; see iris-project.org for a list of research papers. However, in this talk I will focus on the Iris base logic (and its semantics) and sketch how one may define useful program logics on top of the base logic. The base logic is a higher-order intuitionistic modal logic, which, in particular, supports the definition of recursive predicates and whose type of propositions is itself recursively defined.
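For orientation, the core ingredients of a separation logic, written here in generic textbook notation rather than in Iris’s Coq syntax, are a points-to assertion \( \ell \mapsto v \) expressing ownership of a heap location, a separating conjunction \( P * Q \) stating that \( P \) and \( Q \) hold of disjoint resources, and a frame rule licensing local reasoning (stated here without the usual side conditions):
\[
\frac{\{P\}\ e\ \{Q\}}{\{P * R\}\ e\ \{Q * R\}}
\]
As the abstract indicates, in Iris such program-logic rules are not primitive: they are derived from the base logic, which is where the talk will focus.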
-
Sara L. Uckelman (Durham University)
John Eliot’s Logick Primer: A bilingual English-Wôpanâak logic textbook
In 1672 John Eliot, English Puritan educator and missionary, published The Logick Primer: Some Logical Notions to initiate the INDIANS in the knowledge of the Rule of Reason; and to know how to make use thereof [1]. This roughly 80-page pamphlet focuses on introducing basic syllogistic vocabulary and reasoning so that syllogisms can be created from texts in the Psalms, the gospels, and other New Testament books. The use of logic for proselytizing purposes is not distinctive: what is distinctive about Eliot’s book is that it is bilingual, written in both English and Wôpanâak (Massachusett), an Algonquian language spoken in eastern coastal and southeastern Massachusetts. It is one of the earliest bilingual logic textbooks, the only logic textbook I know of in an indigenous American language, and one of the earliest printed attestations of the Massachusett language.
In this talk, I will:
- Introduce John Eliot and the linguistic context he was working in.
- Introduce the contents of the Logick Primer—vocabulary, inference patterns, and applications.
- Discuss notions of “Puritan” logic that inform this primer.
- Talk about the importance of his work in documenting and expanding the Massachusett language and the problems that accompany his colonial approach to this work.
References
[1] J.[ohn] E.[liot]. The Logick Primer: Some Logical Notions to initiate the INDIANS in the knowledge of the Rule of Reason; and to know how to make use thereof. Cambridge, MA: Printed by M.[armaduke] J.[ohnson], 1672.