# Publications

### David Binder & Thomas Piecha, Popper’s Notion of Duality and His Theory of Negations

History and Philosophy of Logic 38(2), pp. 154-189. DOI: 10.1080/01445340.2016.1278517. Published online: 25 Jan 2017. (Draft)

**Abstract:** Karl Popper developed a theory of deductive logic in the late 1940s. In his approach, logic is a metalinguistic theory of deducibility relations that are based on certain purely structural rules. Logical constants are then characterized in terms of deducibility relations. Characterizations of this kind are also called *inferential definitions* by Popper. In this paper, we expound his theory and elaborate some of his ideas and results that in some cases were only sketched by him. Our focus is on Popper’s notion of duality, his theory of modalities, and his treatment of different kinds of negation. This allows us to show how his works on logic anticipate some later developments and discussions in philosophical logic, pertaining to trivializing (*tonk*-like) connectives, the duality of logical constants, dual-intuitionistic logic, the (non-)conservativeness of language extensions, the existence of a bi-intuitionistic logic, the non-logicality of minimal negation, and to the problem of logicality in general.

### Jean Fichot & Thomas Piecha (eds), Beyond Logic. Proceedings of the Conference held in Cerisy-la-Salle, 22-27 May 2017, URI: http://dx.doi.org/10.15496/publikation-18676, Paris 1 Panthéon-Sorbonne and University of Tübingen 2017.

**Abstract:** The project "Beyond Logic" is devoted to what hypothetical reasoning amounts to when we go beyond the realm of "pure" logic into the world where logic is applied. As extralogical areas of application we have chosen philosophy of science as an application within philosophy, informatics as an application within the formal sciences, and law as an application within the field of social interaction. The aim of the conference was to allow philosophers, logicians and computer scientists to present their work in connection with these three areas. The conference took place 22-27 May 2017 in Cerisy-la-Salle at the Centre Culturel International de Cerisy. The proceedings collect abstracts, slides and papers of the presentations given, as well as a contribution from a speaker who was unable to attend.

### Lev Gordeev & Edward Hermann Haeusler, Proof Compression and NP Versus PSPACE

Studia Logica 107, 2019. Special issue on General Proof Theory, edited by T. Piecha & P. Schroeder-Heister.

### Jean-Baptiste Joinet, Nature et logique, de G. Gentzen à J.-Y. Girard

Logique et Analyse 59(234), pp. 157-171, 2016.

**Abstract:** J.-Y. Girard's conception of logic is rooted in the critique of logical naturalness carried out by G. Gentzen in the 1930s. Recently, Girard has radicalized this critique and proposed a refoundation of logic as the product of a general theory of interaction. The present study aims to show that this enterprise moves toward a reunification of the idea of natural logic with that of a logic of nature, via the notion of computation.

### Pierre-Éric Mounier-Kuhn & Maël Pégny, AFCAL and the Emergence of Computer Science in France: 1957-1967

In: A. Beckmann, L. Bienvenu & N. Jonoska (eds), Pursuit of the Universal. CiE 2016. LNCS 9709, Springer, pp. 170-181.

**Abstract:** Founded in 1957, the *Association Française de Calcul* (AFCAL) was the first French society dedicated mainly to numerical computation. Its rapid growth and amalgamation with sister societies in related fields (Operations Research, Automatic Control) in the 1960s resulted in changes of its name and purpose, including the invention and adoption of the term *informatique* in 1962–1964, then in the adoption of *cybernétique* in 1967. Our paper aims to explicate the motives of its creation, its evolving definition and the functions it fulfilled. We seek to understand how this association, at once a learned and a professional society, contributed to the emergence and recognition of Computing as an academic discipline in France. The main sources are the scattered surviving records of AFCAL, preserved in the archives of the Observatoire de Paris, of the *Institut de Mathématiques appliquées de Grenoble* (IMAG) and of the CNRS' *Institut Blaise Pascal* in Paris, as well as AFCAL's first congress and journal, *Chiffres*.

### Alberto Naibo, Putnam-Dummett. Quelle logique pour quel réalisme ?

Archives de Philosophie 2016/4 (Tome 79), pp. 693-720.

**Abstract:** This article compares Putnam's and Dummett's positions with respect to the question of the meaning and knowledge of logico-mathematical truths. This comparison reveals an underlying unity in Putnam's thought, characterized by the attempt to define an intermediate position between a dogmatic realism and a constructive anti-realism. This intermediate position, which can be called "moderate realism", will then be analyzed technically using Kreisel's no-counterexample interpretation.

### Alberto Naibo, Mattia Petrolo & Thomas Seiller, On the Computational Meaning of Axioms

In: J. Redmond, O. Pombo Martins & Á. Nepomuceno Fernández (eds), Epistemology, Knowledge and the Impact of Interaction, Springer 2016, pp. 141-184.

**Abstract:** This paper investigates an anti-realist theory of meaning suitable for both logical and proper axioms. Unlike other anti-realist accounts, such as Dummett–Prawitz verificationism, this account does not call the standard framework of classical logic into question. It also admits semantic features beyond the inferential ones: computational aspects play an essential role in the determination of meaning. To deal with these computational aspects, a relaxation of syntax is necessary. This leads to a general kind of proof theory, where the objects of study are not typed objects like deductions, but rather untyped ones, in which formulas are replaced by geometrical configurations.

### Alberto Naibo, Mattia Petrolo & Thomas Seiller, Verificationism and Classical Realizability

In: C. Başkent (ed), Perspectives on Interrogative Models of Inquiry, Springer 2016, pp. 163-197. (Draft)

**Abstract:** This paper investigates the question of whether Krivine’s classical realizability can provide a verificationist interpretation of classical logic. We argue that this kind of realizability can be considered an adequate candidate for this semantic role, provided that the notion of verification involved is no longer based on proofs, but on programs. On this basis, we show that a special reading of classical realizability is compatible with a verificationist theory of meaning, insofar as pure logic is concerned. Crucially, in order to remain faithful to a fundamental verificationist tenet, we show that classical realizability can be understood from a single-agent perspective, thus avoiding the usual game-theoretic interpretation involving at least two players.

### Maël Pégny, How to Make a Meaningful Comparison of Models: The Church-Turing Thesis Over the Reals

Minds and Machines 26(4), pp. 359-388, 2016. Published online: 2 December 2016.

**Abstract:** It is commonly believed that there is no equivalent of the Church–Turing thesis for computation over the reals. In particular, computational models on this domain do not exhibit the convergence of formalisms that supports this thesis in the case of integer computation. In the light of recent philosophical developments on the different meanings of the Church–Turing thesis, and recent technical results on analog computation, I will show that this current belief confounds two distinct issues, namely the extension of the notion of effective computation to the reals on the one hand, and the simulation of analog computers by Turing machines on the other hand. I will argue that it is possible in both cases to defend an equivalent of the Church–Turing thesis over the reals. Along the way, we will learn some methodological caveats on the comparison of different computational models, and how to make it meaningful.

### Mattia Petrolo & Paolo Pistone, A normal paradox

In: P. Arazim & T. Lávička (eds), *The Logica Yearbook 2016*, College Publications, London, 2017, pp. 173-184.

**Abstract:** We challenge the idea that the lack of a normalization procedure can be taken as a distinctive feature of paradoxes. In particular, we present a counterexample to Tennant's proof-theoretic characterization of paradoxes in intuitionistic natural deduction. Our counterexample is based on a simple cut-elimination technique introduced by Kreisel and Takeuti.

### Thomas Piecha, Completeness in Proof-Theoretic Semantics

In: T. Piecha & P. Schroeder-Heister (eds), Advances in Proof-Theoretic Semantics, Trends in Logic 43, Springer 2016, pp. 231-251.

**Abstract:** We give an overview of completeness and incompleteness results within proof-theoretic semantics. Completeness of intuitionistic first-order logic for certain notions of validity in proof-theoretic semantics has been conjectured by Prawitz. For the kind of semantics proposed by him, this conjecture is still undecided. For certain variants of proof-theoretic semantics the completeness question is settled, including a positive result for classical logic. For intuitionistic logic there are positive as well as negative completeness results, depending on which variant of semantics is considered. Further results have been obtained for certain fragments of first-order languages.

### Thomas Piecha & Peter Schroeder-Heister (eds), Advances in Proof-Theoretic Semantics, Trends in Logic 43, Springer 2016.

This volume is the first ever collection devoted to the field of proof-theoretic semantics. Contributions address topics including the systematics of introduction and elimination rules and proofs of normalization, the categorial characterization of deductions, the relation between Heyting's and Gentzen's approaches to meaning, knowability paradoxes, proof-theoretic foundations of set theory, Dummett's justification of logical laws, Kreisel's theory of constructions, paradoxical reasoning, and the defence of model theory.

The field of proof-theoretic semantics has existed for almost 50 years, though the term itself was proposed by Schroeder-Heister in the 1980s. Proof-theoretic semantics explains the meaning of linguistic expressions in general, and of logical constants in particular, in terms of the notion of proof. This volume emerges from presentations at the Second International Conference on Proof-Theoretic Semantics in Tübingen in 2013, where contributing authors were asked to provide a self-contained description and analysis of a significant research question in this area. The contributions are representative of the field and should be of interest to logicians, philosophers, and mathematicians alike.

Book review by Greg Restall in *Notre Dame Philosophical Reviews*

### Thomas Piecha & Peter Schroeder-Heister, Atomic Systems in Proof-Theoretic Semantics: Two Approaches

In: J. Redmond, O. Pombo Martins & Á. Nepomuceno Fernández (eds), Epistemology, Knowledge and the Impact of Interaction, Springer 2016, pp. 47-62. (Draft)

**Abstract:** Atomic systems are systems of rules containing only atomic formulas. In proof-theoretic semantics for minimal and intuitionistic logic they are used as the base case in an inductive definition of validity. We compare two different approaches to atomic systems. The first approach is compatible with an interpretation of atomic systems as representations of states of knowledge. The second takes atomic systems to be definitions of atomic formulas. The two views lead to different notions of derivability for atomic formulas, and consequently to different notions of proof-theoretic validity. In the first approach, validity is stable in the sense that for atomic formulas logical consequence and derivability coincide for any given atomic system. In the second approach this is not the case. This indicates that atomic systems as definitions, which determine the meaning of atomic sentences, might not be the proper basis for proof-theoretic validity, or conversely, that standard notions of proof-theoretic validity are not appropriate for definitional rule systems.

### Thomas Piecha & Peter Schroeder-Heister (eds), General Proof Theory. Celebrating 50 Years of Dag Prawitz's "Natural Deduction". Proceedings of the Conference held in Tübingen, 27-29 November 2015, URI: http://dx.doi.org/10.15496/publikation-10394, University of Tübingen 2016.

**Abstract:** General proof theory studies how proofs are structured and how they relate to each other, and not primarily what can be proved in particular formal systems. It has been developed within the framework of Gentzen-style proof theory, as well as in categorial proof theory. As Dag Prawitz's monograph "Natural Deduction" (1965) paved the way for this development (he also proposed the term "General Proof Theory"), it is most appropriate to use this topic to celebrate 50 years of this work. The conference took place 27-29 November, 2015 in Tübingen at the Department of Philosophy. The proceedings collect abstracts, slides and papers of the presentations given, as well as contributions from two speakers who were unable to attend.

### Thomas Piecha & Peter Schroeder-Heister, The Definitional View of Atomic Systems in Proof-Theoretic Semantics

In: P. Arazim & T. Lávička (eds), *The Logica Yearbook 2016*, College Publications, London, 2017, pp. 185-200.

**Abstract:** Atomic systems, that is, sets of rules containing only atomic formulas, play an important role in proof-theoretic notions of logical validity. We consider a view of atomic systems as definitions that allows us to discuss a proposal made by Prawitz (2016). The implementation of this view in the base case of an inductive definition of validity leads to the problem that derivability of atomic formulas in an atomic system does not coincide with the validity of these formulas. This is due to the fact that, on the definitional view of atomic systems, there are not just production rules, but both introduction and elimination rules for atoms, which may even generate non-normalizable atomic derivations. This shows that the way atomic systems are handled is a fundamental issue of proof-theoretic semantics.

### Peter Schroeder-Heister, Open problems in proof-theoretic semantics

In: T. Piecha & P. Schroeder-Heister (eds), Advances in Proof-Theoretic Semantics, Trends in Logic 43, Springer 2016, pp. 253-283.

**Abstract:** I present three open problems the discussion and solution of which I consider relevant for the further development of proof-theoretic semantics: (1) The nature of hypotheses and the problem of the appropriate format of proofs, (2) the problem of a satisfactory notion of proof-theoretic harmony, and (3) the problem of extending methods of proof-theoretic semantics beyond logic.

### Peter Schroeder-Heister & Luca Tranchini, Ekman's Paradox

Notre Dame Journal of Formal Logic, 2017. DOI: 10.1215/00294527-2017-0017. (Draft)

**Abstract:** Prawitz observed that Russell’s paradox in naive set theory yields a derivation of absurdity whose reduction sequence loops. Building on this observation, and based on numerous examples, Tennant claimed that this looping feature, or more generally, the fact that derivations of absurdity do not normalize, is characteristic of the paradoxes. Striking results by Ekman show that looping reduction sequences are already obtained in minimal propositional logic, when certain reduction steps, which are prima facie plausible, are considered in addition to the standard ones. This shows that the notion of reduction is in need of clarification. Referring to the notion of identity of proofs in general proof theory, we argue that reduction steps should not merely remove redundancies, but must respect the identity of proofs. Consequently, we propose to modify Tennant’s paradoxicality test by basing it on this refined notion of reduction.