Titles and Abstracts of Contributed Talks


(*) Speakers affiliated to the Tübingen logic group

Matteo Acclavio & Lutz Straßburger (Inria Saclay & LIX, École Polytechnique): From Syntactic Proofs to Combinatorial Proofs

Abstract: Proof theory plays an important role in many areas of computer science. As the name suggests, proof theory aims to study proofs as mathematical objects and their interactions. However, unlike many other mathematical fields, proof theory is not able to identify its objects: we do not have a clear understanding of when two proofs are the same.
In this talk we investigate Hughes' combinatorial proofs as a notion of proof identity for classical logic.
We show how to represent proofs from various syntactic formalisms, including sequent calculus, analytic tableaux, and resolution, by means of combinatorial proofs. For each of these formalisms, a natural notion of proof identity is given by certain rule permutations. We compare these notions with the one that combinatorial proofs enforce. This allows us to compare proofs given in different formalisms.


(*) Michael Arndt (University of Tübingen): The Role of Structural Reasoning in the Genesis of Graph Theory

Abstract: The seminal book on graph theory by Dénes Kőnig, published in the year 1936, collected notions and results from precursory works from the mid to late 19th century by Hamilton, Cayley, Sylvester and others. More importantly, Kőnig himself contributed many of his own results that he had obtained in the more than 20 years that he had been working on this subject matter.
What is noteworthy from a logician's perspective is the fact that the fundamentals of what he calls directed graphs are taken almost exhaustively from Paul Hertz’ 1922 article on structural reasoning about sentences of the form a → b. This fact is not well known, in either logical or graph-theoretical circles, even though Kőnig fully acknowledges it in his book. In view of the numerous trends in recent decades to describe and explicate logical matters by means of graphs, the fact that it was Hertz’ foundation of structural reasoning that informed basic notions of graph theory in the first place is highly significant.
The main goal of this talk is to summarize Hertz’ article and to demonstrate how Kőnig integrated the notions and results presented therein into his book. This is followed by an exposition of how and when Hertz’ results were reinvented in graph-theoretical terms. The talk concludes with a critical assessment of the opinion, expressed by both Hertz and Kőnig, that the more general sentences of the form (a1,...,an) → b, introduced by Hertz in a companion article in 1923, cannot be interpreted by graphs.


Sara Ayhan (Ruhr-University Bochum): The meaning of proofs in different proof systems

Abstract: In this talk I want to address the question of how the meaning of proofs can be determined from the viewpoint of proof-theoretic semantics (PTS). The origins of PTS lie in the question of what constitutes the meaning of the logical connectives, and in its answer: the rules of inference that govern the use of a connective. However, what if we go a step further and ask about the meaning of a proof as a whole? From a PTS point of view it seems sensible to base the meaning of a proof on the rules it contains. This can be extended by two questions. It can firstly be asked what the sense of a proof actually consists of. Asking this question also means diving into the question of whether distinctions similar to Frege's for singular terms or sentences are possible for proofs, e.g. whether we can distinguish a mere syntactic divergence from a divergence in meaning: if we have two (syntactically) different derivations, does this always lead to a difference, firstly, in sense and, secondly, in denotation? The second question is then whether there are differences in determining the sense of proofs for different calculi (in this case: natural deduction vs. sequent calculi).


Constantin C. Brîncuș (University of Bucharest): Are the Open-Ended Rules for Negation Categorical?

Abstract: Vann McGee has recently argued that Belnap's criteria constrain the formal rules of classical natural deduction to uniquely determine the semantic values of the logical connectives and quantifiers if the rules are taken to be open-ended, i.e., if they are truth-preserving within any mathematically possible extension of the original language. An assumption of his argument is that for any class of models there is a mathematically possible language in which there is a sentence true in just those models. This assumption, however, is problematic for the class of models of classical propositional logic. I argue that the existence of non-normal models for the classical propositional connectives, and in particular for negation (i.e., models for which the calculi remain sound and complete, but in which the logical constants have meanings different from the standard ones), undermines McGee's argument.

Extended abstract (including references)


Wagner de Campos Sanz (UFG Brazil): Hypo: a simple constructive semantics for Intuitionistic Sentential Logic, soundness and completeness

Abstract


Bogdan Dicher (University of Lisbon) & Francesco Paoli (University of Cagliari): The original sin of proof-theoretic semantics

Abstract: Proof-theoretic semantics is an alternative to model-theoretic semantics. It aims to explain the meaning of the logical constants in terms of the inference rules that govern their behaviour in proofs. We argue that this must be construed as the task of explaining these meanings relative to a logic, i.e., to a consequence relation. Alas, there is no agreed set of properties that a relation must have in order to qualify as a consequence relation. Moreover, the association of a consequence relation to a logical calculus is not as straightforward as it may seem. We show that these facts are problematic for the proof-theoretic project, but that the problems can be solved. Our thesis is that the consequence relation relevant for proof-theoretic semantics is the one given by the sequent-to-sequent derivability relation in Gentzen systems.


(*) René Gazzari (University of Tübingen): The Calculus of Natural Calculation

Abstract: Gentzen's calculus of Natural Deduction is, indeed, a natural representation of informal (mathematical) reasoning. But mathematicians do not only reason in their proofs; they also have to calculate with mathematical objects represented by the terms of a formal language, and the traditional formalisations of such calculations seem artificial.
The calculus of Natural Calculation is an extension of Gentzen's calculus by suitable term rules allowing for a more natural treatment of equalities and, therefore, an adequate representation of the calculations found in informal proofs. We present this extension, and discuss some of its central properties. We conclude our talk with a sketch of some other extensions of Natural Deduction motivated by the introduction of term rules.


Marianna Girlando (Aix-Marseille University and University of Helsinki) & Nicola Olivetti (Aix-Marseille University): Nested sequents for Lewis’ counterfactual logics

Abstract: Counterfactual logics are a family of modal logics introduced by Lewis to formalise conditional sentences of the form "if A were the case, then B would have been the case". These logics extend the language of classical propositional logic with a two-place modal operator standing for comparative plausibility, by means of which the counterfactual conditional can be defined. The semantics of counterfactual logics is given in terms of neighbourhood models with the property that the family of neighbourhoods associated to each world is nested. In a previous paper, we defined sequent calculi for some logics in Lewis' family by enriching the structure of sequents by means of conditional blocks, which are syntactic objects standing for disjunctions of comparative plausibility formulas. In the present work we define nested sequents for counterfactual logics. The main advantages of these proof systems are that they are fully invertible (unlike the sequent calculi) and that they allow one to obtain a direct completeness proof with respect to neighbourhood models. Finally, the nested structure of a sequent can be represented by a tree of labels, thereby yielding a natural translation into labelled proof systems.

Extended abstract (including references)


Guido Gherardi (Università di Bologna), Paolo Maffezioli (Universidad de Barcelona) & Eugenio Orlandelli (Università di Bologna): Interpolation in extensions of first-order logic

Abstract: We provide a constructive proof of Craig's interpolation theorem for extensions of classical and intuitionistic first-order logic with a special type of geometric axioms, called singular geometric axioms. As a corollary, we obtain a direct proof of interpolation for (classical and intuitionistic) first-order logic with identity, as well as interpolation for several mathematical theories including strict partial orders, apartness, positive partial orders and positive linear orders.


(*) Edward Hermann Haeusler (PUC-Rio) & Lew Gordeev (University of Tübingen): Huge proofs, redundant proofs and some reasons in favor of NP=PSPACE

Abstract: We discuss an argument in favour of NP = PSPACE. Any exponentially sized, linearly height-bounded proof P of a formula A in purely implicational minimal logic is highly redundant: there is at least one derivation that occurs exponentially many times as a subderivation of P. This is a consequence of the fact that any such tree-like proof is labelled with only linearly many formulas (subformulas of A) and is linearly height-bounded. There may exist more than one distinct derivation occurring exponentially many times as a subderivation of P. These derivations, and the way they are glued into each other to form the proof, give rise to a kind of spectral analysis of proofs: the components are the occurring derivations, and the analysis is the way they combine, by means of repetitions, into the whole proof. We thus decompose an exponentially sized, linearly height-bounded proof into combinations of polynomially sized derivations. This combination resembles the horizontal compression method we presented previously. We present a new horizontal compression, based on rewriting rules, that yields a polynomially sized dag-like (compressed) proof of A, and we provide a polynomial-time algorithm for verifying the validity of such dag-like proofs.


Ansten Mørch Klev (Czech Academy of Sciences, Prague): The Logic and Harmony of Identity

Abstract: The standard natural deduction rules for the identity predicate have seemed to some not to be harmonious. Stephen Read has suggested an alternative introduction rule that restores harmony but requires second-order logic for its proper functioning. I shall argue that the standard rules are in fact harmonious. To this end, natural deduction will be enriched with a theory of definitional identity. This leads to a novel conception of what counts as a canonical derivation, on the basis of which the identity elimination rule may be justified.


Hidenori Kurokawa (Kanazawa University) & Alberto Naibo (IHPST, Paris): Stability in Sequent Calculus

Abstract: In this paper we argue that, from the perspective of proof-theoretic semantics, the proof transformation associated with Dummett's notion of stability becomes more perspicuous when looked at through the lens of sequent calculus. From such a perspective, our aim is threefold. First, we analyze a generalized form of proof-expansion recently discussed in the literature (esp. by Luca Tranchini) as a formal way to characterize stability. We show that this expansion can be understood in terms of a specific proof transformation in sequent calculus, i.e. the possibility of permuting a cut upward with respect to a derivation of an identity sequent. Secondly, we use the sequent calculus framework to consider a dual version of stability (where the duality consists in changing the position of the identity derivation with respect to the cut). We show that when no structural rules are involved, stability and its dual allow one to separate the styles of the connectives. Finally, we claim that our approach based on sequent calculus allows one to reconsider the nature of harmony and stability, as both take part in the process of cut elimination. In the case of the most standard connectives, harmony uniformly corresponds to reducing the complexity of a cut, while stability uniformly corresponds to reducing the height of a cut. We explore how far this idea can be extended.


Clément Lion (University of Lille): A dialogical reconstruction of Brouwer's creating subject?

Abstract: The problem of the creating subject has been widely rejected amongst mathematicians due to its vagueness. Even Kreisel's attempt to reconstruct it through an axiomatic theory, using a new connective, has recently been confronted with the impossibility of supplying it with genuine meaning explanations without postulating a non-constructive proof. Our proposal is to investigate how a dialogical treatment of the problem could be elaborated out of the Peircean distinction between the iconic, indexical and symbolic functions of a sign. The semantic problems underlying Kreisel's axiomatic reconstruction would then concern only an objectified (and symbolic) form of subjectivity, which is at stake only at a "strategic level".

Extended abstract (including references)


Zhaohui Luo (Royal Holloway, University of London): Formal Semantics in Modern Type Theories: An Overview

Abstract: I'll give an overview of Formal Semantics in Modern Type Theories (MTT-semantics), and report some recent developments, by focussing on two points:
  (1) The rich structures in MTTs, together with subtyping, make MTTs a nice and powerful framework for the formal semantics of natural language.
  (2) MTT-semantics is both model-theoretic and proof-theoretic, and hence a very attractive semantic framework.
In explaining the first point, we'll introduce MTT-semantics and, at the same time, show that the use and development of coercive subtyping play a crucial role in making MTT-semantics viable. The second point shows that MTTs provide a unique and attractive semantic framework that was not available before for linguistic semantics. Being model-theoretic, MTT-semantics provides wide coverage of various linguistic features and, being proof-theoretic, its foundational languages have a proof-theoretic meaning theory based on inferential uses (appealing philosophically and theoretically), and it establishes a solid foundation for practical reasoning in natural language on proof assistants such as Coq (appealing practically). Altogether, this strengthens the argument that MTT-semantics is a promising framework for formal semantics, both theoretically and practically.

Extended abstract (including references)


Yoshihiro Maruyama (Kyoto University): Harmony, Higher-Order Rules, and the Curry-Howard-Lambek Correspondence

Abstract: Different instances of the Curry-Howard-Lambek correspondence have been developed; yet a general mechanism underpinning them is still unclear. Building upon the theory of higher-order rules, we explicate a uniform mechanism to derive from logical constants the corresponding categorical constructions, and thereby establish the logic-category correspondence at a general level on the basis of general-elimination harmony (GE-harmony for short). This means that general-elimination harmony yields the Curry-Howard-Lambek correspondence: any logical constant harmoniously definable within the system of higher-order rules can in principle be given categorical semantics. We also have a look at several examples of this general mechanism, one of which is a contradictory constant, as in the liar paradox, corresponding to certain "liar categories". Since not all logical constants defined via GE-harmony come from adjoint situations, we also compare GE-harmony with Lawverian categorical harmony. To this end, we revisit the idea of structural functors due to Kosta Dosen, which allows us to revise and sharpen Lawverian harmony. We then consider how many of the GE-harmonious logical constants come from categorical adjunctions, or how many of them are categorically harmonious as well.

Extended abstract (including references)


(*) Hermógenes Oliveira (University of Tübingen): Adequacy for a proof-theoretic semantics based on elimination rules

Abstract: I will prove completeness and soundness of propositional intuitionistic logic with respect to an adaptation of Dummett's proof-theoretic notion of validity based on elimination rules.

Detailed exposition


Mattia Petrolo (Federal University of ABC): Proof-Theoretic Semantics and Paradoxical Languages

Abstract: In proof-theoretic semantics there are two main approaches to the definition of validity for derivations, one based on introduction rules and the other on elimination rules. In the context of the usual systems for intuitionistic logic, these approaches enjoy an important symmetry: both allow one to show that all derivations are valid. However, this symmetry breaks down in the case of proof systems dealing with paradoxical derivations. To substantiate this claim we exploit two different analyses of paradoxes in natural deduction. We show that in one case elimination rules cannot be justified starting from introduction rules and, in the other, introduction rules cannot be justified starting from elimination rules. As a consequence, the proof-theoretic semantics of paradoxical languages arising from the introduction-based and the elimination-based approaches might be very different. Indeed, we argue that these two semantic frameworks open the way to two different notions of paradoxicality in proof-theoretic semantics: either as the failure of some compositional principle (e.g. MP or cut), as would result from an introduction-based approach, or as some notion of partial function, as would result from an elimination-based approach.


Mario Piazza (SNS, Pisa) & Gabriele Pulcini (Universidade Nova de Lisboa): A New Approach to Proof-Theoretic Semantics for Classical Logic

Abstract


(*) Thomas Piecha (University of Tübingen): Abstract semantic conditions and the incompleteness of intuitionistic propositional logic with respect to proof-theoretic semantics

Abstract: In [1] it was shown that intuitionistic propositional logic is semantically incomplete for certain notions of proof-theoretic validity. This called into question a claim by Prawitz, who was the first to propose a proof-theoretic notion of validity and who claimed completeness for it [3, 4]. In this talk we put these and related results into a more general context [2]. We consider the calculus of intuitionistic propositional logic (IPC) and formulate five abstract semantic conditions for proof-theoretic validity which every proof-theoretic semantics is supposed to satisfy. We then consider several more specific conditions under which IPC turns out to be semantically incomplete.
In validity-based proof-theoretic semantics, one normally considers the validity of atomic formulas to be determined by an atomic system S. This atomic system corresponds to what in truth-theoretic semantics is a structure. Via semantical clauses for the connectives, such an atomic system then inductively determines the validity of formulas with respect to S, called 'S-validity' for short, as well as a consequence relation between sets of formulas and single formulas. We completely leave open the nature of S and just assume that a nonempty finite or infinite set of entities called 'bases' is given, to which S belongs. We furthermore assume that for each base S in such a set a consequence relation is given. The relation of universal or logical consequence is, as usual, understood as transmitting S-validity from the antecedents to the consequent. We propose abstract semantic conditions which are so general that they cover most semantic approaches, even classical truth-theoretic semantics. We then show that if in addition certain more special conditions are assumed, IPC fails to be complete. Here a crucial role is played by the generalized disjunction principle. Several concrete notions of proof-theoretic validity are considered, and it is shown which of the conditions rendering IPC incomplete they meet.
From the point of view of proof-theoretic semantics, intuitionistic logic has always been considered the main alternative to classical logic. However, in view of the results to be discussed in this talk, intuitionistic logic does not capture basic ideas of proof-theoretic semantics. Given the fact that a semantics should be primary over a syntactic specification of a logic, we observe that intuitionistic logic falls short of what is valid according to proof-theoretic semantics. The incompleteness of intuitionistic logic with respect to such a semantics therefore raises the question of whether there is an intermediate logic between intuitionistic and classical logic which is complete with respect to it.
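For illustration (a rough gloss only; the precise formulation of the generalized disjunction principle is given in [1] and [2]), the principle can be thought of as extending the familiar disjunction property from the empty context to suitable sets of assumptions:

```latex
% Disjunction property (standard):
\models A \lor B \;\Longrightarrow\; \models A \ \text{or}\ \models B
% Generalized form (rough gloss), for disjunction-free $\Gamma$:
\Gamma \models A \lor B \;\Longrightarrow\; \Gamma \models A \ \text{or}\ \Gamma \models B
```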

References
[1] Piecha, Thomas, Wagner de Campos Sanz, and Peter Schroeder-Heister, 'Failure of completeness in proof-theoretic semantics', Journal of Philosophical Logic, 44 (2015), 321–335. https://doi.org/10.1007/s10992-014-9322-x.
[2] Piecha, Thomas and Peter Schroeder-Heister, Incompleteness of intuitionistic logic with respect to proof-theoretic semantics, Studia Logica 107 (2019). Available online at https://doi.org/10.1007/s11225-018-9823-7 and via Springer Nature SharedIt at https://rdcu.be/5dDs.
[3] Prawitz, Dag, 'Towards a foundation of a general proof theory', in P. Suppes et al. (eds), Logic, Methodology and Philosophy of Science IV, North-Holland, Amsterdam, 1973, pp. 225–250.
[4] Prawitz, Dag, 'An approach to general proof theory and a conjecture of a kind of completeness of intuitionistic logic revisited', in L. C. Pereira, E. H. Haeusler, and V. de Paiva, (eds), Advances in Natural Deduction, Springer 2014, pp. 269–279.

(Joint work with Peter Schroeder-Heister, University of Tübingen.)


(*) Paolo Pistone & Luca Tranchini (University of Tübingen): The calculus of higher-level rules in modern dress

Abstract: The "categorification" of lambda-calculi and natural deduction systems by means of higher-order categories has been a recurrent theme in recent years' research. While most type systems can be presented as categories of finite order (e.g. 2-operads), it is well known that Martin-Löf Type Theory generates, under certain hypotheses, a weak omega-category.
In this talk we develop the suggestion that Peter Schroeder-Heister's 1984 extension of natural deduction with higher-level rules (HLR) provides a second example of a category of infinite order. We propose a formulation of this system within the language of opetopes, one of the existing approaches to weak omega-categories. We show that this formulation of HLR allows for a cleaner description of higher-level substitution, and leads to a better understanding of normalization and identity of proofs.


David Pym (UCL London): Reductive logic and proof-theoretic semantics

Abstract: While the deductive approach to logic begins with premisses and in step-by-step fashion applies proof rules to derive conclusions, the complementary reductive approach instead begins with a putative conclusion and searches for premisses sufficient for a legitimate derivation to exist by systematically reducing the space of possible proofs. Not only does this picture perhaps more closely resemble the way in which mathematicians actually prove theorems and, more generally, the way in which people solve problems using formal representations, it also encapsulates diverse applications of logic in computer science such as the programming paradigm known as logic programming, the proof-search problem at the heart of symbolic AI and automated theorem proving, precondition generation in program verification and more. It is also reflected at the level of truth-functional semantics - the perspective on logic utilized for the purpose of model-checking and thus verifying the correctness of industrial systems - wherein the truth value of a formula is calculated according to the truth values of its constituent parts. Developing the mathematical theory of reductive logic - proof theory, semantics, and control - for wide classes of substructural, modal, intuitionistic, and classical systems is a substantial ongoing project which is drawing upon not only the traditional tools of proof theory, model theory, and category theory but also upon the more recently developed theory of coalgebra and its applications in modelling stateful systems. Underlying this mathematical work is the basic observation that the spaces of objects that must be explored in reductive logic, either in the setting of proof-search or in the setting of model-checking, are bigger than the corresponding spaces of proofs or semantics realizers. 
This situation closely resembles that which gives rise to the structures used to set up concepts of validity in proof-theoretic semantics, and is used to support additional structure that is of use computationally.


Greg Restall (University of Melbourne): Isomorphisms in a Category of Proofs

Abstract: In this talk, I will show how a category of formulas and classical proofs can give rise to three different hyperintensional notions of sameness of content.
One of these notions is very fine-grained, going so far as to distinguish p and p∧p, while identifying other distinct pairs of formulas, such as p∧q and q∧p; p and ¬¬p; or ¬(p∧q) and ¬p∨¬q.
Another relation is more coarsely grained, and gives the same account of identity of content as equivalence in Angell’s logic of analytic containment. A third notion of sameness of content is defined, which is intermediate between Angell’s and Parry’s logics of analytic containment. Along the way we show how purely classical proof theory gives resources to define hyperintensional distinctions thought to be the domain of properly non-classical logics.

Extended abstract


Dave Ripley (Monash University): Consequence-theoretic semantics and entailment (unable to attend)

Abstract: The purpose of this talk is to present and begin to articulate an alternative to both model-theoretic and proof-theoretic semantics: consequence-theoretic semantics. On this approach, both proofs and models are mere technical auxiliaries, without any direct connection to meaning. Meaning is rather located in the consequence relation itself. The key theoretical notion I will use, borrowing from Greg Restall, is that of a position---a collection of assertions and denials. Some positions are ruled out, and others are not.
On this basis, a consequence-theoretic semantics, like any semantics, owes (among other things) an account of entailment. But while consequence is a matter of ruling out, entailment seems to be a matter of guaranteeing or ruling in; it's not immediately obvious how to use a position-theoretic framework to understand entailment. This talk shows how to do it, by developing a notion of equivalence between positions, and using this to understand when assertions and denials can be implicit in a position. Seen this way, entailment and consequence might have the same extension, or they might not; it depends on what structural features consequence itself has. I'll show a few examples of this dependence.

Extended abstract (including references)


Andrei Rodin (National Research University Higher School of Economics Moscow): Extra-logical proof-theoretic semantics in HoTT

Abstract: I. Kant famously argued that elementary geometrical statements such as Euclid’s Triangle Angle Sum theorem cannot be deduced from the first principles by purely logical means because their proofs require extra-logical geometrical constructions. The discovery of non-Euclidean geometries in the 19th century made Kant’s analysis of geometrical reasoning untenable in its original form, and throughout the following 20th century it was generally viewed as fundamentally mistaken or at least wholly outdated. However the recently emerged Homotopy Type theory (HoTT) and the related program of building new “univalent” foundations of mathematics provide a formal and conceptual basis for revising, once again, the epistemic role and logical function of extra-logical constructions in mathematical (and other) proofs.
The key feature of HoTT, which modifies the intended logical semantics of Martin-Löf Type theory (MLTT), is the homotopical cumulative hierarchy of types, which distinguishes between types of different homotopy levels. This hierarchy suggests a simple (albeit not uncontroversial) criterion of logicality according to which only types with at most one term qualify as propositions, and only applications of MLTT rules to propositions and their terms (proofs), that is, to judgements in the traditional sense of the term, qualify as logical inferences stricto sensu. According to the same criterion, applications of the same schematic rules to types and terms of higher homotopy levels are extra-logical constructions. Such extra-logical applications of deductive rules also have a proof-theoretic impact, because the obtained non-propositional constructions serve as witnesses for associated propositions (formally obtained via the propositional truncation of higher types) in a manner similar to that in which elementary geometrical constructions support traditional geometrical proofs. For this reason it is justified, in our view, to qualify the standard semantics of HoTT just outlined as proof-theoretic. Using HoTT as a motivating example, I would like to discuss further the role of logical and extra-logical elements in formal proofs.


Tor Sandqvist (KTH Stockholm): Preservation of Structural Properties in Intuitionistic Extensions of an Inference Relation

Abstract: The paper approaches cut elimination from a new angle. On the basis of an arbitrary inference relation among logically atomic sentences, an inference relation on a language possessing logical operators is defined by means of inductive clauses similar to the operator-introducing rules of a cut-free intuitionistic sequent calculus. The logical terminology of the richer language is not uniquely specified, but assumed to satisfy certain conditions of a general nature, allowing for, but not requiring, the existence of infinite conjunctions and disjunctions. We investigate to what extent structural properties of the given atomic relation are preserved through the extension to the full language. While closure under the Cut rule narrowly construed is not in general thus preserved, two properties jointly amounting to closure under the ordinary structural rules, including Cut, are.
In the interest of conceptual economy, deducibility relations are specified purely inductively, rather than in terms of the existence of formal proofs of any kind.


Peter Schuster (University of Verona); joint work with Giulio Fellin & Daniel Wessel (both University of Verona): The Jacobson radical and Glivenko's Theorem

Abstract: Alongside the analogy between maximal ideals and complete theories, the Jacobson radical carries over from ideals of commutative rings to theories of propositional calculi. This prompts a variant of Lindenbaum's Lemma that relates classical validity and intuitionistic provability, and the syntactical counterpart of which is Glivenko's Theorem. Apart from shedding some more light on intermediate logics, this eventually prompts a non-trivial interpretation in logic of Rinaldi, Schuster and Wessel's conservation criterion for Scott-style entailment relations (BSL 2017 & Indag. Math. 2018).
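As background to the syntactical counterpart mentioned in the abstract, Glivenko's Theorem (1929) for propositional logic states that a formula is classically provable exactly when its double negation is intuitionistically provable:

```latex
% Glivenko's Theorem, propositional case:
\vdash_{\mathrm{CL}} A \quad\Longleftrightarrow\quad \vdash_{\mathrm{IL}} \neg\neg A
```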


Will Stafford (UC Irvine): Inquisitive and proof theoretic semantics

Abstract: When looking at the logic captured by proof-theoretic validity one must pay special attention to the sets of atomic rules considered. This talk will explore how the sets of atomic rules that are allowed can be modified to give a notion of proof-theoretic validity which is equivalent to inquisitive semantics. This offers a new perspective both on inquisitive semantics and the flexibility of proof-theoretic validity.


Kai Tanter (Monash University): An Inferentialist Semantics for Atomics, Predicates, and Names

Abstract: Inferentialism is a theory in the philosophy of language which claims that the meanings of expressions are constituted by inferential roles or relations, rather than truth and reference. Instead of a traditional model-theoretic semantics, inferentialism naturally lends itself to a proof-theoretic semantics, where meaning is understood in terms of inference rules within a proof system. Most work in proof-theoretic semantics has focused on logical constants, with comparatively little work on the semantics of non-logical vocabulary. Drawing on Robert Brandom’s notion of material inference and Greg Restall’s bilateralist interpretation of the multiple conclusion sequent calculus, I present a proof-theoretic semantics for atomic sentences and their component names and predicates. The resulting system has several interesting features: (i) the rules for atomic sentences are determined by those for their component predicates and names; (ii) cut elimination for the system can be proved; (iii) model-theoretic extensions can be interpreted as idealisations derived from the more fundamental inference rules; (iv) in contrast to other proof-theoretic approaches to non-logical vocabulary, the inference rules are of a similar kind to those for logical vocabulary, and priority is given to neither the introduction (right) nor the elimination (left) rules.

Extended abstract (including references)


Bartosz Więckowski (Goethe University Frankfurt): Modes of Assumptions and Moods of Implications

Abstract: We define a natural deduction system which permits, depending on the canonical derivability of a formula in a given reference proof system, a factual or a counterfactual mode of assuming it. The system contains I/E-rules for factual and for counterfactual implication which make use of these modes. Derivations in this system normalize and normal derivations have the subformula property. In contrast to proof systems for counterfactual logics which result from internalizing possible worlds truth conditions, the system admits a foundationally autarkical proof-theoretic semantics for counterfactuals.

Extended abstract (including references)


(*) Ernst Zimmermann (University of Tübingen): Proof-theoretic Semantics for Natural Deduction based on Inversion

Abstract: The talk presents a full Proof-Theoretic Semantics for Natural Deduction based on a slightly modified inversion principle of Prawitz (D. Prawitz, Natural Deduction, 1965). The extension of Prawitz's definition of inversion is this: the elimination rule for a connective q may invert the introduction rule for q, but also, vice versa, the introduction rule for a connective q may invert the elimination rule for q. Such an extension of inversion, to be defined in detail, yields the following theorem:
INVERSION THEOREM
Inversion for the two rules of a connective q (intro rule, elim rule) exists iff a conversion of a maximum formula for q exists. In short: inversion equals conversion.
The inversion theorem is instantiated for two fragments of logic: the Lambek Calculus (LC) and Intuitionistic Linear Logic (ILL), each with four propositional connectives: two multiplicatives (implication and conjunction) and two additives (conjunction and disjunction). LC is defined using elimination rules by composition; ILL is defined using general elimination rules.
The considerations show that in LC and ILL, for each of the four connectives, exactly one member of the pair intro rule/elim rule is inverting. Climbing up the substructural hierarchy, more structural rules become available, differences between connectives collapse, and inversion becomes arbitrary: for intuitionistic conjunction and disjunction, for instance, both the intro rule and the elim rule are inverting, due to the structural rules.
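As a minimal illustration of the kind of conversion the theorem refers to (sketched here for ordinary conjunction with a general elimination rule, using the `proof` LaTeX package; the LC/ILL cases in the talk are analogous but substructurally constrained):

```latex
% Detour conversion: the maximum formula A \land B is introduced by
% \land-I and immediately consumed by the general elimination rule;
% the conversion replaces the detour by plugging the premise
% derivations of A and B directly into the minor derivation of C.
% Requires \usepackage{proof}.
\[
\infer[{\scriptstyle\land E}]{C}{
  \infer[{\scriptstyle\land I}]{A \land B}{A & B}
  &
  \deduce{C}{\deduce{\vdots}{[A] \;\; [B]}}
}
\quad\rightsquigarrow\quad
\deduce{C}{\deduce{\vdots}{A \;\; B}}
\]
```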

Extended abstract