# Publications

## David Binder & Thomas Piecha, Popper’s Notion of Duality and His Theory of Negations

In: History and Philosophy of Logic 38(2), pp. 154-189. DOI: 10.1080/01445340.2016.1278517. Published online: 25 Jan 2017. (Draft)

**Abstract:** Karl Popper developed a theory of deductive logic in the late 1940s. In his approach, logic is a metalinguistic theory of deducibility relations that are based on certain purely structural rules. Logical constants are then characterized in terms of deducibility relations. Characterizations of this kind are also called *inferential definitions* by Popper. In this paper, we expound his theory and elaborate some of his ideas and results that in some cases were only sketched by him. Our focus is on Popper’s notion of duality, his theory of modalities, and his treatment of different kinds of negation. This allows us to show how his works on logic anticipate some later developments and discussions in philosophical logic, pertaining to trivializing (*tonk*-like) connectives, the duality of logical constants, dual-intuitionistic logic, the (non-)conservativeness of language extensions, the existence of a bi-intuitionistic logic, the non-logicality of minimal negation, and to the problem of logicality in general.

## Jean Fichot & Thomas Piecha (eds), Beyond Logic. Proceedings of the Conference held in Cerisy-la-Salle, 22-27 May 2017, URI: http://dx.doi.org/10.15496/publikation-18676, Paris 1 Panthéon-Sorbonne and University of Tübingen 2017

**Abstract:** The project "Beyond Logic" is devoted to the question of what hypothetical reasoning amounts to when we go beyond the realm of "pure" logic into the world where logic is applied. As extralogical areas we have chosen philosophy of science as an application within philosophy, informatics as an application within the formal sciences, and law as an application within the field of social interaction. The aim of the conference was to allow philosophers, logicians and computer scientists to present their work in connection with these three areas. The conference took place 22-27 May, 2017 in Cerisy-la-Salle at the Centre Culturel International de Cerisy. The proceedings collect abstracts, slides and papers of the presentations given, as well as a contribution from a speaker who was unable to attend.

## Lev Gordeev & Edward Hermann Haeusler, Proof Compression and NP Versus PSPACE

In: Studia Logica 107, 2019. Special issue on General Proof Theory, edited by T. Piecha & P. Schroeder-Heister.

## Jean-Baptiste Joinet, Nature et logique, de G. Gentzen à J.-Y. Girard

In: Logique et Analyse 59(234), pp. 157-171, 2016.

**Abstract:** J.-Y. Girard’s conception of logic is rooted in the critique of logical naturality carried out by G. Gentzen in the 1930s. Recently, Girard has radicalized this critique and proposed an enterprise of refounding logic as the product of a general theory of interaction. The present study aims to show that this enterprise points towards a reunification of the idea of natural logic with that of a logic of nature, via the notion of computation.

## Pierre-Éric Mounier-Kuhn & Maël Pégny, AFCAL and the Emergence of Computer Science in France: 1957-1967

In: A. Beckmann, L. Bienvenu & N. Jonoska (eds), Pursuit of the Universal. CiE 2016. LNCS 9709, Springer, pp. 170-181.

**Abstract:** Founded in 1957, the *Association Française de Calcul* (AFCAL) was the first French society dedicated mainly to numerical computation. Its rapid growth and amalgamation with sister societies in related fields (Operations Research, Automatic Control) in the 1960s resulted in changes of its name and purpose, including the invention and adoption of the term *informatique* in 1962–1964, then in the adoption of *cybernétique* in 1967. Our paper aims at explicating the motives of its creation, its evolving definition and the functions it fulfilled. We seek to understand how this association, at once a learned and a professional society, contributed to the emergence and recognition of Computing as an academic discipline in France. The main sources are the scattered surviving records of AFCAL, conserved in the archives of the Observatoire de Paris, of the *Institut de Mathématiques appliquées de Grenoble* (IMAG) and of the CNRS’ *Institut Blaise Pascal* in Paris, as well as AFCAL's first congress and journal, *Chiffres*.

## Alberto Naibo, Putnam-Dummett. Quelle logique pour quel réalisme ?

In: Archives de Philosophie 2016/4 (Tome 79), pp. 693-720.

**Abstract:** This article compares Putnam’s and Dummett’s positions with respect to the question of the meaning and knowledge of logico-mathematical truths. This comparison reveals an underlying unity in Putnam’s thought, characterized by the attempt to define an intermediate position between a dogmatic realism and a constructive anti-realism. This intermediate position, which can be called “moderate realism”, will then be analyzed technically using Kreisel’s no-counter-example interpretation.

## Alberto Naibo, Mattia Petrolo & Thomas Seiller, On the Computational Meaning of Axioms

In: J. Redmond, O. Pombo Martins & Á. Nepomuceno Fernández (eds), Epistemology, Knowledge and the Impact of Interaction, Springer 2016, pp. 141-184.

**Abstract:** This paper investigates an anti-realist theory of meaning suitable for both logical and proper axioms. Unlike other anti-realist accounts such as Dummett–Prawitz verificationism, the standard framework of classical logic is not called into question. This account also admits semantic features beyond the inferential ones: computational aspects play an essential role in the determination of meaning. To deal with these computational aspects, a relaxation of syntax is necessary. This leads to a general kind of proof theory, where the objects of study are not typed objects like deductions, but rather untyped ones, in which formulas are replaced by geometrical configurations.

## Alberto Naibo, Mattia Petrolo & Thomas Seiller, Verificationism and Classical Realizability

In: C. Başkent (ed), Perspectives on Interrogative Models of Inquiry, Springer 2016, pp. 163-197. (Draft)

**Abstract:** This paper investigates the question of whether Krivine’s classical realizability can provide a verificationist interpretation of classical logic. We argue that this kind of realizability can be considered an adequate candidate for this semantic role, provided that the notion of verification involved is no longer based on proofs, but on programs. On this basis, we show that a special reading of classical realizability is compatible with a verificationist theory of meaning, insofar as pure logic is concerned. Crucially, in order to remain faithful to a fundamental verificationist tenet, we show that classical realizability can be understood from a single-agent perspective, thus avoiding the usual game-theoretic interpretation involving at least two players.

## Maël Pégny, How to Make a Meaningful Comparison of Models: The Church-Turing Thesis Over the Reals

In: Minds and Machines 26(4), pp. 359-388 (2 December 2016)

**Abstract:** It is commonly believed that there is no equivalent of the Church–Turing thesis for computation over the reals. In particular, computational models on this domain do not exhibit the convergence of formalisms that supports this thesis in the case of integer computation. In the light of recent philosophical developments on the different meanings of the Church–Turing thesis, and recent technical results on analog computation, I will show that this current belief confounds two distinct issues, namely the extension of the notion of effective computation to the reals on the one hand, and the simulation of analog computers by Turing machines on the other hand. I will argue that it is possible in both cases to defend an equivalent of the Church–Turing thesis over the reals. Along the way, we will learn some methodological caveats on the comparison of different computational models, and how to make it meaningful.

## Maël Pégny & Mohamed Issam Ibnouhsein, Quelle transparence pour les algorithmes d'apprentissage machine ?

In: Revue d'Intelligence Artificielle, 32(4), pp. 447-478 (Décembre 2018)

**Abstract:** Recently, the concept of "transparency of algorithms" has become of primary importance in the public and scientific debates. In the light of the proliferation of uses of the term "transparency", we distinguish two families of fundamental uses of the concept: a descriptive family relating to intrinsic epistemic properties of programs, the first of which are intelligibility and explicability, and a prescriptive family that concerns the normative properties of their uses, the first of which are loyalty and fairness. Because one needs to understand an algorithm in order to explain it and carry out its audit, intelligibility is logically first in the philosophical study of transparency. In order to better determine the challenges of intelligibility in the public use of algorithms, we introduce a distinction between the intelligibility of the procedure and the intelligibility of outputs. Finally, we apply this distinction to the case of machine learning.

## Mattia Petrolo & Paolo Pistone, A normal paradox

In: P. Arazim & T. Lávička (eds), *The Logica Yearbook 2016*, College Publications, London, 2017, pp. 173-184.

**Abstract:** We challenge the idea that the lack of a normalization procedure can be taken as a distinctive feature of paradoxes. In particular, we present a counterexample to Tennant's proof-theoretic characterization of paradoxes in intuitionistic natural deduction. Our counterexample follows a simple technique for eliminating cuts introduced by Kreisel and Takeuti.

## Thomas Piecha, Completeness in Proof-Theoretic Semantics

In: T. Piecha & P. Schroeder-Heister (eds), Advances in Proof-Theoretic Semantics, Trends in Logic 43, Springer 2016, pp. 231-251.

**Abstract:** We give an overview of completeness and incompleteness results within proof-theoretic semantics. Completeness of intuitionistic first-order logic for certain notions of validity in proof-theoretic semantics has been conjectured by Prawitz. For the kind of semantics proposed by him, this conjecture is still undecided. For certain variants of proof-theoretic semantics the completeness question is settled, including a positive result for classical logic. For intuitionistic logic there are positive as well as negative completeness results, depending on which variant of semantics is considered. Further results have been obtained for certain fragments of first-order languages.

## Thomas Piecha & Peter Schroeder-Heister (eds), Advances in Proof-Theoretic Semantics, Trends in Logic 43, Springer 2016.

This volume is the first ever collection devoted to the field of proof-theoretic semantics. Contributions address topics including the systematics of introduction and elimination rules and proofs of normalization, the categorial characterization of deductions, the relation between Heyting's and Gentzen's approaches to meaning, knowability paradoxes, proof-theoretic foundations of set theory, Dummett's justification of logical laws, Kreisel's theory of constructions, paradoxical reasoning, and the defence of model theory.

The field of proof-theoretic semantics has existed for almost 50 years, but the term itself was proposed by Schroeder-Heister in the 1980s. Proof-theoretic semantics explains the meaning of linguistic expressions in general and of logical constants in particular in terms of the notion of proof. This volume emerges from presentations at the Second International Conference on Proof-Theoretic Semantics in Tübingen in 2013, where contributing authors were asked to provide a self-contained description and analysis of a significant research question in this area. The contributions are representative of the field and should be of interest to logicians, philosophers, and mathematicians alike.

Book review by Greg Restall in *Notre Dame Philosophical Reviews*

## Thomas Piecha & Peter Schroeder-Heister, Atomic Systems in Proof-Theoretic Semantics: Two Approaches

In: J. Redmond, O. Pombo Martins & Á. Nepomuceno Fernández (eds), Epistemology, Knowledge and the Impact of Interaction, Springer 2016, pp. 47-62. (Draft)

**Abstract:** Atomic systems are systems of rules containing only atomic formulas. In proof-theoretic semantics for minimal and intuitionistic logic they are used as the base case in an inductive definition of validity. We compare two different approaches to atomic systems. The first approach is compatible with an interpretation of atomic systems as representations of states of knowledge. The second takes atomic systems to be definitions of atomic formulas. The two views lead to different notions of derivability for atomic formulas, and consequently to different notions of proof-theoretic validity. In the first approach, validity is stable in the sense that for atomic formulas logical consequence and derivability coincide for any given atomic system. In the second approach this is not the case. This indicates that atomic systems as definitions, which determine the meaning of atomic sentences, might not be the proper basis for proof-theoretic validity, or conversely, that standard notions of proof-theoretic validity are not appropriate for definitional rule systems.

## Thomas Piecha & Peter Schroeder-Heister (eds), General Proof Theory. Celebrating 50 Years of Dag Prawitz's "Natural Deduction". Proceedings of the Conference held in Tübingen, 27-29 November 2015, URI: http://dx.doi.org/10.15496/publikation-10394, University of Tübingen 2016

**Abstract:** General proof theory studies how proofs are structured and how they relate to each other, and not primarily what can be proved in particular formal systems. It has been developed within the framework of Gentzen-style proof theory, as well as in categorial proof theory. As Dag Prawitz's monograph "Natural Deduction" (1965) paved the way for this development (he also proposed the term "General Proof Theory"), it is most appropriate to use this topic to celebrate 50 years of this work. The conference took place 27-29 November, 2015 in Tübingen at the Department of Philosophy. The proceedings collect abstracts, slides and papers of the presentations given, as well as contributions from two speakers who were unable to attend.

## Thomas Piecha & Peter Schroeder-Heister, The Definitional View of Atomic Systems in Proof-Theoretic Semantics

In: P. Arazim & T. Lávička (eds), *The Logica Yearbook 2016*, College Publications, London, 2017, pp. 185-200.

**Abstract:** Atomic systems, that is, sets of rules containing only atomic formulas, play an important role in proof-theoretic notions of logical validity. We consider a view of atomic systems as definitions that allows us to discuss a proposal made by Prawitz (2016). The implementation of this view in the base case of an inductive definition of validity leads to the problem that derivability of atomic formulas in an atomic system does not coincide with the validity of these formulas. This is due to the fact that, on the definitional view of atomic systems, there are not just production rules, but both introduction and elimination rules for atoms, which may even generate non-normalizable atomic derivations. This shows that the way atomic systems are handled is a fundamental issue of proof-theoretic semantics.

## Thomas Piecha & Peter Schroeder-Heister, Incompleteness of Intuitionistic Propositional Logic with Respect to Proof-Theoretic Semantics

In: T. Piecha & P. Schroeder-Heister (eds), Special issue on General Proof Theory, Studia Logica 107 (2019). Available online at https://doi.org/10.1007/s11225-018-9823-7 and via Springer Nature SharedIt at: https://rdcu.be/5dDs.

**Abstract:** Prawitz (1973, 2014) proposed certain notions of proof-theoretic validity and conjectured that intuitionistic logic is complete for them. Considering propositional logic, we present a general framework of five abstract conditions which any proof-theoretic semantics should obey. Then we formulate several more specific conditions under which the intuitionistic propositional calculus (IPC) turns out to be semantically incomplete. Here a crucial role is played by the generalized disjunction principle. Turning to concrete semantics, we show that prominent proposals, including Prawitz’s, satisfy at least one of these conditions, thus rendering IPC semantically incomplete for them. Only for Goldfarb’s (2016) proof-theoretic semantics, which deviates from standard approaches, does IPC turn out to be complete. Overall, these results show that basic ideas of proof-theoretic semantics for propositional logic are not captured by IPC.

## Thomas Piecha & Peter Schroeder-Heister (eds), Special issue on General Proof Theory

In: Studia Logica 107:1 (2019).

General Proof Theory: Introduction. Available online at https://doi.org/10.1007/s11225-018-9818-4 and via Springer Nature SharedIt at https://rdcu.be/6hGW.

**Abstract:** This special issue on general proof theory collects papers resulting from the conference on general proof theory held in November 2015 in Tübingen.

## Peter Schroeder-Heister, Open problems in proof-theoretic semantics

In: T. Piecha & P. Schroeder-Heister (eds), Advances in Proof-Theoretic Semantics, Trends in Logic 43, Springer 2016, pp. 253-283.

**Abstract:** I present three open problems the discussion and solution of which I consider relevant for the further development of proof-theoretic semantics: (1) The nature of hypotheses and the problem of the appropriate format of proofs, (2) the problem of a satisfactory notion of proof-theoretic harmony, and (3) the problem of extending methods of proof-theoretic semantics beyond logic.

## Peter Schroeder-Heister, Die Grundlagen mathematischer Beweise. Philosophisch-logische Überlegungen

In: Mitteilungen der Mathematischen Gesellschaft in Hamburg 38 (2018), pp. 117-137

**Abstract:** After a brief sketch of the origins of proof theory and of Hilbert's idea to regard proofs as mathematical objects, the problem of identity of proofs is discussed from the point of view of general proof theory and proof-theoretic semantics. Fundamental problems associated with the 'redundancy view', according to which proofs remain the same when obvious redundancies are removed, are pointed out. The formulation of a proper notion of harmony between logical rules is considered a way out of these problems. It is also emphasized that the annotation of logical (and extra-logical) rules applied in a proof is essential for the proof itself and thus a crucial part of it. This idea leads to type-theoretic conceptions of logical and mathematical reasoning.

## Peter Schroeder-Heister & Luca Tranchini, Ekman's Paradox

In: Notre Dame Journal of Formal Logic 58 (2017), pp. 567-581. doi:10.1215/00294527-2017-0017. (Draft)

**Abstract:** Prawitz observed that Russell’s paradox in naive set theory yields a derivation of absurdity whose reduction sequence loops. Building on this observation, and based on numerous examples, Tennant claimed that this looping feature, or more generally, the fact that derivations of absurdity do not normalize, is characteristic of the paradoxes. Striking results by Ekman show that looping reduction sequences are already obtained in minimal propositional logic, when certain reduction steps, which are prima facie plausible, are considered in addition to the standard ones. This shows that the notion of reduction is in need of clarification. Referring to the notion of identity of proofs in general proof theory, we argue that reduction steps should not merely remove redundancies, but must respect the identity of proofs. Consequently, we propose to modify Tennant’s paradoxicality test by basing it on this refined notion of reduction.

## Peter Schroeder-Heister & Luca Tranchini, How to Ekman a Crabbé-Tennant

In: Synthese (2018), pp. 1–23. Available online at https://doi.org/10.1007/s11229-018-02018-3 and via Springer Nature SharedIt at: https://rdcu.be/bdbPp.

**Abstract:** Developing early results of Prawitz, Tennant proposed a criterion for an expression to count as a paradox in the framework of Gentzen’s natural deduction: paradoxical expressions give rise to non-normalizing derivations. Two distinct kinds of cases, going back to Crabbé and Tennant, show that the criterion overgenerates, that is, there are derivations which are intuitively non-paradoxical but which fail to normalize. Tennant’s proposed solution consists in reformulating natural deduction elimination rules in general (or parallelized) form. Developing intuitions of Ekman we show that the adoption of general rules has the consequence of hiding redundancies within derivations. Once reductions to get rid of the hidden redundancies are devised, it is clear that the adoption of general elimination rules offers no remedy to the overgeneration of the Prawitz–Tennant analysis. In this way, we indirectly provide further support for a solution to one of the two overgeneration cases developed in previous work.