Modeling hypothetical reasoning: validity, inference, and paradoxes

19-20 February 2015, IHPST, Paris

 

This is the final meeting of the ANR-DFG funded project HYPOTHESES, Hypothetical Reasoning – Its Proof-Theoretic Analysis. Its aim is to investigate the modeling of hypothetical reasoning via Gentzen’s calculi (both Natural Deduction and Sequent Calculus) and other calculi recently developed in this tradition (e.g. nested sequents and proof-nets). The main focus will be on validity, inference, and paradoxes, three crucial notions in evaluating the philosophical significance of such systems when applied to the analysis of hypothetical reasoning.

Speakers include:

  • Vito Michele Abrusci (Univ. Rome 3)
  • Thierry Coquand (Univ. Gothenburg)
  • Wagner de Campos Sanz (Univ. Federal de Goiás)
  • Jean Fichot (IHPST Univ. Paris 1)
  • Jean-Baptiste Joinet (Univ. Lyon 3)
  • Hidenori Kurokawa (Univ. Kobe)
  • Luiz Carlos Pereira (PUC Rio)
  • Mattia Petrolo (IHPST Univ. Paris 1)
  • Thomas Piecha (Univ. Tübingen)
  • Paolo Pistone (Univ. Rome 3/Aix-Marseille)
  • Stephen Read (Univ. St Andrews)
  • Giovanni Sambin (Univ. Padova)
  • Luca Tranchini (Univ. Tübingen)

 

Conference venue

Institut d'Histoire et de Philosophie des Sciences et des Techniques (IHPST)
Salle de conférences, 13, rue du Four, 75006 Paris, http://www-ihpst.univ-paris1.fr

 

Program

Thursday, 19 February 2015

9h30 - 10h30 Stephen Read: Inferentialism and the Logic of Identity
10h30 - 11h00 Coffee break
11h00 - 11h45 Luiz Carlos Pereira: Assumptions and the cut-rule in Natural Deduction
11h45 - 12h30 Thomas Piecha: Completeness in proof-theoretic semantics
12h30 - 14h30 Lunch
14h30 - 15h30 Thierry Coquand: Paradoxes in type theory
15h30 - 16h15 Jean Fichot: Conservativity and Analyticity
16h15 - 16h45 Coffee break
16h45 - 17h30 Luca Tranchini: Harmony, rule equivalence and type isomorphism
17h30 - 18h15 Mattia Petrolo & Paolo Pistone: Untyped validity: from interaction to rules

Friday, 20 February 2015

9h30 - 10h30 Giovanni Sambin: A justification of intuitionistic logic by the principle of reflection
10h30 - 11h00 Coffee break
11h00 - 11h45 Hidenori Kurokawa: The principle of reflection via nested sequents
11h45 - 12h30 Wagner de Campos Sanz: Hypo: a constructivist semantics
12h30 - 14h30 Lunch
14h30 - 15h15 Jean-Baptiste Joinet: Individuals, characters and their sociality
15h15 - 15h45 Coffee break
15h45 - 16h30 Vito Michele Abrusci: Validity, inference, paradoxes: insights coming from linear logic

 

Titles and abstracts

Vito Michele Abrusci (Univ. Rome 3)
Validity, inference, paradoxes: insights coming from linear logic

Abstract: Linear logic may be presented by using traditional insights on the notion of validity (of a hypothetical reasoning), inference and paradox. But new methodological insights on these concepts arise from the logical investigations performed inside linear logic:

  1. to the traditional insights on the notion of validity (of a hypothetical reasoning) is added a new one (or, better, the rehabilitation of a Kantian insight): valid reasoning as reasoning satisfying certain "conditions of possibility";
  2. to the traditional insights on the notion of inference is added a new one, which may be called a "geometrical insight";
  3. to the traditional insights on the notion of paradox is added an understanding of the role of structural rules in some kinds of paradoxes, and the discovery that some kinds of paradoxes interact very well with valid proofs and allow us to investigate more deeply the notions of proof (program) and proof-reduction (computation).

Thierry Coquand (Univ. Gothenburg)
Paradoxes in type theory

Abstract: We will first survey the study of paradoxes in dependent type theory, starting from Girard's paradox. Some variations of these paradoxes involve the axiom of description. We will then try to explain in what way Voevodsky's univalent foundations program might give new insight into these issues.

Wagner de Campos Sanz (Univ. Federal de Goiás)
Hypo: a constructivist semantics

Abstract: Hypo is a semantics given by means of clauses for each separate propositional constant, together with some clauses for dealing with its core concept: the concept of hypotheses. Semantic validity is defined for sentences depending on sets of hypotheses. Both completeness and soundness of intuitionistic propositional logic with respect to Hypo are proved constructively. Hypo and Kripke semantics share noteworthy similarities; in a sense, Kripke semantics can be seen as a particular case of Hypo.

Jean Fichot (IHPST Univ. Paris 1)
Conservativity and Analyticity

Abstract: The relations between analyticity, as a property of proofs, and conservativity, as a property of theories, play a central role in a number of arguments in favor of logical revisionism. I shall offer a critical but fair survey of this debate.

Jean-Baptiste Joinet (Univ. Lyon 3)
Individuals, characters and their sociality

Abstract: In first order logic, the metaphysical distinction between Individuals and Predicates is taken for granted. I will first analyze why the underlying extensional conception of semantics prevents any critical view on that unquestioned distinction. I will then move to the semantical ideas coming from programming languages theory (actional semantics) and will bring out the conditions under which, in that framework, individuals, singular characters, and collective behaviors (propositions) emerge.

Hidenori Kurokawa (Univ. Kobe)
The principle of reflection via nested sequents

Abstract: Sambin, Battilotti and Faggian (2000, JSL) propose a means of introducing logical constants by a proof-theoretic method which can broadly be understood as belonging to the framework of proof-theoretic semantics. Roughly speaking, the idea is 1) to use an equivalence formula as a "definition" of a logical constant, 2) to derive the ordinary operational rules for the logical constant in a cut-free sequent calculus using only minimal background assumptions, and 3) to understand various non-classical logics as structural variants. The idea works well for multiplicative and additive conjunction and disjunction. But handling implication in this manner is more complicated.

I will propose a means of treating implication in this framework using a proof-theoretic technique known as nested sequents, one which adheres as closely as possible to Sambin et al.'s conceptual motivations. Nested sequents have been used in modal logic as a natural proof-theoretic representation of Kripke semantics. I will argue, however, that there are better means of motivating nested sequents, both technically and philosophically. On the philosophical side, I will first argue that there are conceptual problems arising in standard formulations of intuitionistic logic which point towards the adoption of strictly weaker logics. To formulate these concerns proof-theoretically, it is often more convenient to use nested sequents rather than traditional sequent calculi. Once this step is undertaken, we can then show that nested sequents can be used to formulate a variety of non-classical logics in a uniform way. Time permitting, I will also discuss how nested sequents compare with other generalizations of sequent calculi.

Luiz Carlos Pereira (PUC Rio)
Assumptions and the cut-rule in Natural Deduction

Abstract: In 1938 Gerhard Gentzen proved for the third time the consistency of arithmetic. In Gentzen's own words, in this third proof "the main emphasis will be placed on developing the fundamental ideas and on making every single step of the proof as lucid as possible." It is clear now, especially after the recent illuminating work of Prawitz on normalization for arithmetic, that these fundamental ideas were related to a normalization strategy for natural deduction. In 1996 Haeusler and Pereira showed how the reductions defined by Gentzen in this third proof could be used to prove a strong cut-elimination result for classical and intuitionistic propositional logic. In a recent [still unpublished] paper, Prawitz was able to use, "in a seemingly magical way", the techniques introduced by Gentzen in this third proof in order to prove a normalization theorem directly for a natural deduction formulation of Peano Arithmetic. Prawitz's proof solved a problem that resisted a satisfactory solution for more than 35 years: how to extend Gentzen's proof to natural deduction, and in particular how to define the notion of level line, so fundamental in the case of sequent calculus, directly for natural deduction. The aim of the present paper is to explore the way Prawitz's result deals with the problem of assumption-substitution in order to obtain a new proof of strong normalization.

(Joint work with Edward Hermann Haeusler)

Mattia Petrolo (IHPST Univ. Paris 1) & Paolo Pistone (Univ. Rome 3/Aix-Marseille)
Untyped validity: from interaction to rules

Abstract: Proof-theoretic validity is usually defined as a property of derivations built according to a previously defined set of rules (see e.g. Prawitz [1971]). Yet, "validity should be a distinguishing feature, telling that some derivations are valid while some others are not" (Schroeder-Heister [2006]). As a consequence, a wider setting should be investigated, allowing for invalid derivations. To this end, we introduce Pure Natural Deduction, in which derivations are defined as abstract trees. Pure (or "untyped") derivations can be regarded as possibly invalid: they reproduce the computational behavior of pure lambda-terms, including the possible violation of the normalization property. On the other hand, intuitionistic natural deduction (NJ) derivations can be seen as "typed", i.e. as trees whose nodes and branches are labeled, respectively, by formulae and rules. For every formula A, we define validity with respect to A as a property of pure derivations (in the style of Tait-Girard reducibility technique for typed lambda-calculi). Finally, we show that NJ is complete in the following sense: for every formula A, if d is a closed normal pure derivation which is valid with respect to A, then a closed normal NJ derivation of A can be recovered from d. As a corollary of our completeness theorem, we are able to recover the usual properties of validity à la Prawitz.

Thomas Piecha (Univ. Tübingen)
Completeness in proof-theoretic semantics

Abstract: We present and discuss completeness and incompleteness results in proof-theoretic semantics.

Stephen Read (Univ. St Andrews)
Inferentialism and the Logic of Identity

Abstract: Inferentialism claims that the rules for the use of an expression give its meaning, without any need to invoke a meaning or denotation for it. Logical inferentialism endorses inferentialism specifically for the logical constants. What, then, are the logical constants, and how can they be identified? It is conjectured here that the logical expressions are those which can be given schematic rules that lie in a certain sort of harmony, general-elimination harmony. Owen Griffiths (RSL 2014) claims that identity cannot be given such rules, concluding that logical inferentialists are committed to ruling identity a non-logical expression. It is shown that the rules for identity given in Read (Analysis 2004) are indeed suitably harmonious, demonstrating that identity is indeed a logical notion.

Giovanni Sambin (Univ. Padova)
A justification of intuitionistic logic by the principle of reflection

Abstract: A logical constant is said to be obtained by the principle of reflection (Sambin, Battilotti and Faggian, Basic Logic: reflection, symmetry, visibility, JSL 2000) if it can be seen as the reflection into object language of a link between assertions in the metalanguage. I have often underlined the fact that all logical constants, in many logics, can be obtained via the principle of reflection. In this talk I will concentrate on the specific case of intuitionistic logic. The aim is to discuss in detail conceptual novelties and virtues of the principle of reflection itself. I will argue in particular that it provides a fully convincing justification of intuitionistic logic which relies on a minimal amount of assumptions.

Luca Tranchini (Univ. Tübingen)
Harmony, rule equivalence and type isomorphism

Abstract: One way of explaining the notion of harmony is by using the inversion principle. I will draw attention to the fact that many advocates of the inversion principle implicitly adopt a notion of equivalence between rules. Only together with a notion of equivalence does the inversion principle yield a thorough account of harmony. The reason is that the elimination rules generated by inversion are not the only elimination rules in harmony with a given collection of introduction rules. Intuitively, any collection of elimination rules equivalent (i.e. interderivable) with the one generated by inversion will also be in harmony with the given collection of introduction rules.

By considering some examples, I will first argue that this picture yields a notion of meaning which is too "extensional". Furthermore, I will suggest that in order to attain a more intensional account, one should abandon the notion of equivalence in favor of a stricter notion, modeled on that of type isomorphism from the lambda calculus.

 

Organizers

Jean Fichot (jean.fichot@univ-paris1.fr)
Jean-Baptiste Joinet (jean-baptiste.joinet@univ-lyon3.fr)
Mattia Petrolo (mattia.petrolo@univ-paris1.fr)