29-30 May, 2014, IHPST, Paris


Since Gentzen's introduction of natural deduction and the sequent calculus in the 1930s, the study of hypothetical reasoning has become one of the central aspects of the (structural) analysis of formal derivations. The aim of this conference is to investigate some recent developments of this tradition and to evaluate their philosophical significance.


This conference is part of the ANR-DFG funded project HYPOTHESES, Hypothetical Reasoning – Its Proof-Theoretic Analysis.

Speakers include:

  • Vito Michele Abrusci (Univ. Rome 3)
  • Michael Arndt (Univ. Tübingen)
  • Federico Aschieri (ENS Lyon)
  • Nicolas Clerbout (Univ. Lille 3)
  • Ole Hjortland (Univ. Munich)
  • Jean-Baptiste Joinet (Univ. Lyon 3)
  • Jon Litland (Univ. Oslo)
  • Alberto Naibo (IHPST Univ. Paris 1)
  • Luiz Carlos Pereira (PUC Rio)
  • Mattia Petrolo (IHPST Univ. Paris 1)
  • Thomas Piecha (Univ. Tübingen)
  • Jan von Plato (Univ. Helsinki)
  • Shahid Rahman (Univ. Lille 3)
  • Tor Sandqvist (KTH Stockholm)
  • Peter Schroeder-Heister (Univ. Tübingen)
  • Göran Sundholm (Univ. Leiden)
  • Luca Tranchini (Univ. Tübingen)

Conference venue

Institut d'Histoire et de Philosophie des Sciences et des Techniques (IHPST)
Salle de conférences, 13, rue du Four, 75006 Paris, http://www-ihpst.univ-paris1.fr


Thursday, 29 May 2014

9h15 - 9h30    Opening Panel, chaired by Per Martin-Löf
9h30 - 10h30   Jan von Plato: Reasoning about generality and existence
10h30 - 10h45  Coffee break
10h45 - 11h45  Michael Arndt: Eight inference rules for implication
11h45 - 12h45  Göran Sundholm: Assumptions
13h00 - 14h30  Lunch
14h30 - 15h30  Luca Tranchini: Ekman's paradox and generalized natural deduction inferences
15h30 - 16h30  Jon Litland: The completeness and stability of intuitionistic propositional logic
16h30 - 17h00  Coffee break
17h00 - 18h00  Michele Abrusci: Hypothetical reasoning and linear logic
18h00 - 19h00  Nicolas Clerbout & Shahid Rahman: Linking game-theoretical approaches with constructive type theory: dialogical strategies as CTT-demonstrations

Friday, 30 May 2014

9h30 - 10h30   Peter Schroeder-Heister: Harmony in proof-theoretic semantics: The issue of propositional quantification
10h30 - 10h45  Coffee break
10h45 - 11h45  Tor Sandqvist: Base-extension semantics and second-order quantification
11h45 - 12h45  Luiz Carlos Pereira: Assumptions in natural deduction
13h00 - 14h30  Lunch
14h30 - 15h30  Jean-Baptiste Joinet: The computational basis of semantics
15h30 - 16h30  Ole Hjortland: An n-sided sequent calculus for paraconsistent and paracomplete theories of truth
16h30 - 17h00  Coffee break
17h00 - 18h00  Federico Aschieri, Alberto Naibo & Mattia Petrolo: A hypotheses-based analysis of proof-theoretic validity
18h00 - 19h00  Thomas Piecha: The completeness problem in proof-theoretic semantics

Titles and abstracts

Michele Abrusci (Univ. Rome 3)
Hypothetical reasoning and linear logic

Abstract: I will discuss the proof-theoretical and philosophical contributions arising from Linear Logic and its developments to the questions formulated inside the research project on "Hypothetical reasoning".

Michael Arndt (Univ. Tübingen)
Eight inference rules for implication

Abstract: We provide an exhaustive list of all the possible inference rules that are related to the left implication rule of the sequent calculus, in the sense that all of them are derivable from the same logical ground sequent. We also consider the inference rules in the calculus of natural deduction that correspond to these eight inference rules. For the sequent calculus, we discuss the similarities and differences that they exhibit, as well as their interaction with the right implication rule under cut and identity. For natural deduction, we discuss the phenomenon that several of the rules cannot even be distinguished unless the formal distinction between inference rules and derivation rules is taken seriously and extended further. We conclude that ground sequents are a much more suitable means of expressing logical content in a sequent calculus than inference rules.
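For orientation, a minimal sketch in standard notation (our illustration; the eight variants discussed in the talk are not reproduced here): the familiar left implication rule of the sequent calculus, and the logical ground sequent from which it is derivable by cuts.

```latex
% The standard two-premiss left implication rule:
\[
\frac{\Gamma \vdash A \qquad \Delta, B \vdash C}
     {\Gamma, \Delta, A \supset B \vdash C}
\;(\supset L)
\]
% The logical ground sequent for implication:
\[
A,\; A \supset B \;\vdash\; B
\]
% The rule is derivable from the ground sequent by two cuts:
% cut \Gamma \vdash A against A, A \supset B \vdash B
% to obtain \Gamma, A \supset B \vdash B, then cut the result
% against \Delta, B \vdash C to obtain \Gamma, \Delta, A \supset B \vdash C.
```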

Federico Aschieri (ENS Lyon), Alberto Naibo & Mattia Petrolo (IHPST Univ. Paris 1)
A hypotheses-based analysis of proof-theoretic validity

Abstract: Classical logic can be interpreted computationally as a process of making arbitrary hypotheses, testing them and revising them if they are wrong. This interpretation is much more informative than the one resulting from the usual negative translations of classical into intuitionistic logic, which are implicitly used whenever disjunction and the existential quantifier are dropped from the logical language. Inspired by this, we introduce a new notion of proof-theoretic validity based on hypotheses, of which Prawitz's validity for intuitionistic logic turns out to be just a particular case. Our notion of validity introduces a new distinction between two kinds of hypotheses: simple assumptions, as place-holders for proofs, and what can be called conjectures (in the spirit of Lakatos). Such an analysis leads to a justification of classical logic with the full set of logical constants, in which the verificationist structural properties are preserved. On this basis, the distinction between classical and intuitionistic logic can be interpreted as a distinction between the use of different kinds of hypotheses rather than different kinds of inference rules.

Nicolas Clerbout & Shahid Rahman (Univ. Lille 3)
Linking game-theoretical approaches with constructive type theory: dialogical strategies as CTT-demonstrations

Abstract: The development of a dialogical approach to Constructive Type Theory (CTT) can be motivated by the following considerations:

1. In his book The Interactive Stance, J. Ginzburg (2011) stresses the utmost importance of taking conversational (interactive) aspects into account in order to develop a theory of meaning in which meaning is constituted during interaction. In order to implement such a theory of meaning, Ginzburg makes use of Constructive Type Theory (CTT), where the so-called "metalogical" rules that constitute meaning are explicitly imported into the object language. Moreover, Ginzburg designed a kind of language game, called dialogical gameboards, intended to capture the dynamic aspects of everyday dialogues.

Now, if we take seriously the claim that meaning is constituted by and within interaction then we expect that the semantics of the underlying logical elements is also understood dialogically. In this context, a dialogical approach to constructive type theory provides both a dialogical frame for the underlying logic and a natural link to the dialogical gameboards.

2. In some recent work, Peter Dybjer (2012) proposed to study, by game-theoretical means, the relation between the intensional and the extensional notions of identity defined in CTT. Once more, in this context an approach to CTT that is already game-theoretical seems desirable.

3. The links between game-theoretical approaches and CTT seem very natural. Indeed, the CTT approach fits naturally into the dialogical framework, where the meaning of the logical constants is given by moves such as challenges and choices that constitute the explicit development of a play. Challenges and choices are in fact part of the object language of the development of a play.

4. The fathers of dialogical logic had the project of developing both a general theory of meaning (not only of logic) and a new path in the foundations of (constructive) mathematics and logic. Here, too, it is CTT that enriches the dialogical approach.

Given these motivations, it is important to prove that the dialogical approach to CTT developed by the two authors in recent papers yields a notion of winning strategy that really corresponds to the notion of CTT demonstration: this is the main aim of the present study. More precisely, we consider Martin-Löf's Constructive Type Theory on the one hand, and dialogues with play-objects on the other. Our purpose is to show the following:

There is a winning P-strategy in the dialogical game for φ if and only if there is a CTT-demonstration for p : φ, where p is some proof-object for φ.

One important aspect of the present study is that we restrict ourselves to the logically valid fragment of CTT. The reason is that, once the result is established for that fragment, it can be extended to cover the whole CTT system. The present work is thus organized as follows:

  • First, we recall the two frameworks at stake, insisting in particular on the dialogical framework.
  • Then we focus on the demonstration of the left-to-right direction of the equivalence result.
  • We illustrate the use of the algorithm obtained from the theorem by showing how to transform a specific winning strategy into a CTT-demonstration of the axiom of choice, and back.
  • The last part of the study deals with the algorithm from CTT-demonstrations to dialogical strategies, and thus with the right-to-left direction of the equivalence result.
  • We conclude by introducing elements of discussion which are to be developed in subsequent papers.

Ole Hjortland (Univ. Münich)
An n-Sided Sequent Calculus for Paraconsistent and Paracomplete Theories of Truth

Abstract: We present an n-sided sequent calculus for paraconsistent and paracomplete theories of truth, and compare it with previous proof-theoretic approaches to such theories, especially that given by Halbach and Horsten (2006). The n-sided framework offers a common proof-theoretic setting for such theories, as well as an interesting inferential diagnosis of the paradoxes.

(Joint work with Johannes Korbmacher.)

Jean-Baptiste Joinet (Univ. Lyon 3)
The computational basis of semantics

Abstract: In my talk, I will start by bringing out two weaknesses of the standard logical approach to semantics (set-theoretical semantics for predicate logic formulas) with respect to Frege's distinction between Sense and Reference, namely that this standard approach implies:

  1. an artificial limitation of the domain of application of the concept of Sense,
  2. an artificial orientation of the relationship between Sense and Reference.

I will then advocate that, to overcome those weaknesses, one needs to replace (as the starting point) the ordinary semantical question "What is this text referring to?" with the pragmatic questions "What are the effects of this text? How does it act?". The most emblematic languages for abstractly addressing this idea of an "actional semantics" of language are the programming languages of computing theory. Indeed, a "program" is by nature a text whose meaning is actional; it is nothing but an acting text.

From there, I will review the counterparts in the computational world of the usual basic concepts of semantics (sense/reference, consistency, individuals, ...). I will then come back to the "usual" logical statements and quickly sketch the solutions brought by the proofs-as-programs paradigm (generalizations of the so-called Curry-Howard correspondence and of the Formulas-as-Types approach) to the two major obstacles that, at first sight, confront the Wittgenstein-Dummett claim that the meaning of statements lies in their proofs.

Jon Litland (Univ. Olso)
The completeness and stability of intuitionistic propositional logic

Abstract: I show how to construct a proof-theoretic meaning-theory for the propositional logical constants based on elimination rules. I show that the resulting logic is exactly intuitionistic propositional logic. Having such a meaning-theory I give a precise characterization of the notion of proof-theoretic stability. It is then possible to show, on fairly mild assumptions, that the regular intuitionistic elimination rules are stable.

Luiz Carlos Pereira (PUC Rio)
Assumptions in Natural Deduction

Abstract: Natural deduction systems are historically and conceptually connected to assumption formation. Proofs are defined as derivations in which all assumptions are discharged. In Prawitz's Natural Deduction, the systems for classical, intuitionistic and minimal logic are first defined with no discharging strategy (or with the so-called "savage" discharging strategy). Discharging functions are not used before the chapter on modal logic. We know that, although there is no difference in deductive power, the adoption of no discharging strategy/function has undesirable proof-theoretical consequences:

  1. No strong normalization;
  2. No unicity of normal form;
  3. No Curry-Howard correspondence.

The aim of this presentation is to examine the role of dependency relations and discharging functions in proof theory.
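As a reminder of what is at stake in item 3 (a standard illustration, not the presentation's own material): under the Curry-Howard correspondence, discharging an assumption in an implication introduction corresponds to λ-abstraction, so an unregimented "savage" discharging strategy leaves the associated λ-term underdetermined.

```latex
% ⊃-introduction with an explicit discharging function (label n),
% and its λ-calculus counterpart:
\[
\frac{\begin{matrix}[A]^{\,n}\\ \vdots\\ B\end{matrix}}{A \supset B}\;(\supset I,\, n)
\qquad\leadsto\qquad
\frac{\Gamma,\, x : A \vdash t : B}{\Gamma \vdash \lambda x.\, t : A \supset B}
\]
% Without a discharging function it is indeterminate which occurrences
% of the assumption A the abstraction \lambda x is meant to bind.
```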

Thomas Piecha (Univ. Tübingen)
The completeness problem in proof-theoretic semantics

Abstract: Several proof-theoretic notions of validity have been proposed in the literature, for which completeness of intuitionistic logic has been conjectured. Arbitrary extensions of atomic systems are an integral part of these notions. We consider a notion of validity, which captures the common core of these notions, and we refute the completeness conjecture for it.

(Joint work with W. de Campos Sanz and P. Schroeder-Heister.)

Jan von Plato (Univ. Helsinki)
Reasoning about generality and existence

Abstract: Frege explained generality by stating that each of its instances is a fact, and only later added the crucial observation that generality can be inferred from an arbitrary instance. The reception of Frege's quantifiers was a fifty-year struggle over a conceptual priority: truth or provability. With the former as the basic notion, generality had to be faced as an infinite collection of facts, whereas with the latter, generality was based on a uniformity with a finitary sense: the provability of an arbitrary instance.

Tor Sandqvist (KTH Stockholm)
Base-extension semantics and second-order quantification

Abstract: The strategy of evaluating conditional claims in terms of arbitrary extensions of atomic inference systems, while traditionally associated with intuitionistic logic, has recently been found capable of providing semantic foundations for classical deduction as well. In my talk I will report on an ongoing effort to extend this approach to the realm of second-order logic. By delimiting the format of atomic inference rules in suitable ways, it is possible to make out as logically valid not only the law of excluded middle, but also various existential claims, inter alia comprehension for arbitrary first-order formulae. After outlining how such results come about, I will address the question of whether, and if so in what sense, they can be thought of as lending support to a realist conception of mathematical objects.

Peter Schroeder-Heister (Univ. Tübingen)
Harmony in proof-theoretic semantics: The issue of propositional quantification

Abstract: I give a criterion of harmony between introduction and elimination rules which is based on quantified rules or, alternatively, on second-order intuitionistic propositional logic. It can be shown that introduction and elimination rules are not always on the same level. When passing from introductions to harmonious eliminations one (provably) sometimes needs to raise the level of rules. And when passing from eliminations to harmonious introductions one sometimes not only needs to raise the level of rules, but also must use propositional quantification.
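For background, the standard second-order encodings of the connectives (often called the Russell-Prawitz translation; our illustration, not the talk's own criterion) show how propositional quantification enters second-order intuitionistic propositional logic:

```latex
% Second-order definitions of the propositional connectives:
\[
A \land B \;:=\; \forall p\, \bigl((A \supset (B \supset p)) \supset p\bigr)
\]
\[
A \lor B \;:=\; \forall p\, \bigl((A \supset p) \supset ((B \supset p) \supset p)\bigr)
\]
\[
\bot \;:=\; \forall p\, p
\]
```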

Göran Sundholm (Univ. Leiden)
Assumptions

Abstract: The notion of assumption is pivotal for the understanding of Natural Deduction techniques in proof theory. Consideration of only uninterpreted formal systems metamathematically construed has led to locutions such as:

"Assume φ"

where φ is a well-formed formula, that is, a (meta-)mathematical object, and hardly open to be assumed. In my lecture I shall consider the pragmatics of assumptions, and draw a crucial distinction between (a) antecedent propositions in consequence relations, and (b) epistemic assumptions that premisses are known. The effect of this distinction will be spelled out via a comparison between Gentzen's two Natural Deduction formats for representing derivations, namely, the 1932 format with open assumption formulae, and the 1936 sequential formulation that lists open assumptions as antecedents in sequents.

Historical connections to Frege, Bolzano, and Meinong will be indicated. Drawing upon the Curry-Howard "isomorphism", with derivations in fully explicit form (with appended proof-terms), a novel kind of reduction will be considered.

The role of implications A ⊃ B, when added as novel axioms, will be considered in connection with the use of assumptions. Many alleged "axioms" that have been considered in current discussions of the philosophical foundations for mathematical constructivism will be rejected as lacking in proper motivation. Prominent examples are Kreisel's "Axiom of Christian Charity", from the theory of the Creative Subject, and the "knowability axiom" A ⊃ ◊ KA that is familiar from the so called "Fitch's Paradox".

Luca Tranchini (Univ. Tübingen)
Ekman's paradox and generalized natural deduction inferences

Abstract: Inspired by the failure of normalizability in natural deduction systems for naive set theory, Jan Ekman proposed a natural extension of the notion of reduction relative to which normalization in the Gentzen-Prawitz natural deduction system for intuitionistic propositional logic (IPL) fails. Jan von Plato argued that Ekman's counterexample to normalization can be blocked by the adoption of general elimination rules. We show that this is not the case, by defining a reduction procedure for the formulation of IPL with general elimination rules which is based on the harmony between general elimination and general introduction rules.

(Joint work with Peter Schroeder-Heister.)



Organizers

Jean Fichot (jean.fichot@univ-paris1.fr)
Alberto Naibo (alberto.naibo@univ-paris1.fr)
Mattia Petrolo (mattia.petrolo@univ-paris1.fr)