Titles and Abstracts

Arnon Avron: Using Assumptions in Gentzen-type Systems

Abstract: The consequence relation between formulas which is induced by a fully structural Gentzen-type system G is usually taken to be: Γ ⊢G ψ iff the sequent Γ ⇒ ψ is provable in G. However, no less useful is the consequence relation ⊢vG defined by: Γ ⊢vG ψ iff the sequent ⇒ ψ is derivable in G from the set of sequents {⇒ φ | φ ∈ Γ}. This is one particular case in which it is useful to infer a sequent from a set of assumptions which are themselves sequents. We present several other examples of the usefulness of such inferences, including non-classical cases in which doing this is not only useful, but essential for the expressive power of the logic. The main technical tool used in the various applications we present is a generalization of the usual cut-elimination theorem (which treats only assumption-free derivations) to what we call the strong cut-elimination theorem (which applies also to derivations of sequents from other sequents).
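
For concreteness, the two relations can be set side by side (a plain LaTeX restatement of the definitions above):

    \[
      \Gamma \vdash_G \psi \;\Longleftrightarrow\; \vdash_G \Gamma \Rightarrow \psi,
      \qquad
      \Gamma \vdash^{v}_{G} \psi \;\Longleftrightarrow\; \{\Rightarrow \varphi \mid \varphi \in \Gamma\} \vdash_G\; \Rightarrow \psi.
    \]

For classical LK the two relations coincide: one direction is immediate by cutting each assumption sequent ⇒ φ against a proof of Γ ⇒ ψ. The abstract's point is that in non-classical settings they can come apart.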

Michel Bourdeau: Comte's « Théorie fondamentale des hypothèses »

Abstract: In spite of its importance (it triggered interest in the topic in the nineteenth century), Comte's theory of hypothesis has not received the attention it deserves, and there has been no in-depth study of it since a paper by Laudan in 1971.
The argument takes its starting point in the fact that there is more to our ways of reasoning than induction and deduction. Science could not progress without scientists resorting to hypotheses, that is, moving beyond the data: assuming some phenomenon, drawing consequences from it, and asking whether or not they agree with the facts. The question then is: what kinds of hypotheses are admissible, and what kinds are not?
Comte's answer is often dismissed as verificationist, but there is much more to say about it. When he spoke of unverifiable hypotheses, he had in mind notions like phlogiston, caloric, or the luminiferous ether, which were still quite common in his time, and he relied on the pioneering thermodynamical work of Fourier, who began precisely by rejecting this very kind of hypothesis; but the theory also has to be understood in relation to his anti-metaphysical stance. The influence of Comte's theory can be seen in the work of Duhem, but mainly in Peirce's theory of abduction.

Torben Braüner: Seligman-style deduction for hybrid modal logic

Abstract: A number of different sorts of proof-systems for modal logics are available: major examples are labelled systems, display logics, multiple-sequent systems, and hybrid-logical systems. It is not the purpose of this talk to discuss the pros and cons of all these systems, but instead to focus on one particular sort of proof-system based on hybrid modal logic, which is an extension of ordinary modal logic allowing explicit reference to individual points in a Kripke model (the points stand for times, locations, states, possible worlds, or something else).
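
As a reminder of the machinery involved (standard hybrid-logic notation, not taken from the abstract itself): hybrid logic adds nominals a, b, ... naming individual points, and satisfaction operators @a, where @a φ says that φ holds at the point named by a:

    \[
      \mathcal{M}, w \models @_a \varphi \;\Longleftrightarrow\; \mathcal{M}, w_a \models \varphi,
      \qquad w_a \text{ the unique point named by the nominal } a.
    \]
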
This sort of proof-system was put forward by Jerry Seligman in the late 1990s. One particular feature of these systems is the possibility of jumping to a hypothetical time (or whatever the points stand for), doing some reasoning, and then jumping back to the present time again. In natural deduction versions the hypothetical reasoning is kept track of by machinery called explicit substitutions, similar to the "proof-boxes" of linear logic. Such a hybrid-logical proof-box encapsulates hypothetical reasoning taking place at one particular time. Within the proof-box, information depending on the hypothetical time can be dealt with, but only non-indexical information, that is, statements whose truth-values do not depend on the time, can flow in and out of the box.
In my talk I'll present Seligman-style natural deduction, and I'll discuss in particular the above-mentioned machinery enabling hypothetical reasoning. I'll also briefly describe two other lines of work in the area of Seligman-style reasoning:
1. Seligman-style natural deduction has turned out to be useful for formalizing so-called false-belief tests in cognitive psychology.
2. Recently developed Seligman-style tableau systems with desirable proof-theoretic properties.

Michael Cohen: Explanatory Justice: The Case of Disjunctive Explanations

Abstract: In recent years there has been an effort to explicate the concept of explanatory power in a Bayesian framework by constructing explanatory measures. It has been argued that those measures should not violate the principle of explanatory justice, which states that explanatory power cannot be extended 'for free'. I argue, by formal means, that one recent measure that was claimed to be immune from explanatory injustice fails to be so. Thereafter, I propose a conceptual way of avoiding the counterintuitive side effects of these formal results: another way of understanding the notion of 'extending explanatory success' in cases of negative explanatory power. A consequence of this interpretation is that the original explanatory justice criticism disappears. Furthermore, some of the formal results derived from explanatory measures become conceptually clear under this understanding.
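
As an illustration of the kind of Bayesian explicatum at issue (one well-known proposal from the literature, not necessarily the measure discussed in this talk), Schupbach and Sprenger's measure of the power of a hypothesis h to explain evidence e is

    \[
      \mathcal{E}(e,h) \;=\; \frac{P(h \mid e) - P(h \mid \lnot e)}{P(h \mid e) + P(h \mid \lnot e)},
    \]

which is positive when e is positively relevant to h, and negative (negative explanatory power) when e lowers the probability of h.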

Kosta Došen: An introduction to deduction

Abstract: To determine what deductions are, it does not seem sufficient to know that the premises and conclusions are propositions, or something in the field of propositions, like commands and questions. It seems equally, if not more, important to know that deductions form structures, which in mathematics we find in categories, multicategories and polycategories. It also seems important to know that deductions should be members of particular kinds of families, which is what is meant by their being in accordance with rules.

Paul Egré: Negating indicative conditionals

Abstract: Negation is often viewed as a "litmus test" for theories of indicative conditionals in natural language (see Handley, Evans and Thompson 2006). Three main families of semantic theories compete when it comes to understanding the interaction of negation with conditional sentences, namely: i) suppositional analyses inspired by the Ramsey Test, according to which the negation of "if P then Q" is of the form "if P then not Q"; ii) the two-valued truth-functional analysis, according to which the negation should be a conjunction of the form "P and not Q"; and iii) strict conditional analyses, predicting a weak negation of the form "possibly P and not Q". We present the results of several experimental studies concerning the denial of indicative conditional sentences in dialogues, intended to show that all three forms of negation can be retrieved from the weak negation, depending on the additional assumptions available to the denier of the conditional.
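
Schematically, the three candidate denials of "if P then Q" mentioned above are (writing ¬ for negation and ◇ for possibility):

    \[
      \text{(i) suppositional: } P \to \lnot Q; \qquad
      \text{(ii) truth-functional: } P \land \lnot Q; \qquad
      \text{(iii) strict: } \Diamond(P \land \lnot Q).
    \]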

(Joint work with Guy Politzer)

Nissim Francez and Julien Murzi: On a dogma of Proof-Theoretic Semantics: generalising canonicity of derivations

Abstract: Traditional proof-theoretic semantics is entrenched in a dogma according to which the operational rules for a logical constant (most often the I-rules) determine the meaning of the constant. As a result, the proof reductions that show the balance between self-justified rules and other rules concern only I/E-rules. The paper claims that in certain cases, of which classical logic is a prime example, structural rules also participate in meaning determination. As a result, more reductions, involving the structural rules, are needed to show that an ND-system qualifies as meaning-conferring. The paper deals with two cases, bilateral ND and multiple-conclusion ND, and proposes reductions for them.
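
For orientation, the paradigmatic I/E-reduction alluded to above is the conjunction detour (a textbook example, not one specific to the paper): an ∧-introduction immediately followed by an ∧-elimination is removed in favour of the subderivation of the extracted conjunct:

    \[
      \frac{\;\frac{\mathcal{D}_1}{A} \qquad \frac{\mathcal{D}_2}{B}\;}{A \land B}\,(\land I)
      \quad\text{then}\quad
      \frac{A \land B}{A}\,(\land E)
      \qquad\rightsquigarrow\qquad
      \frac{\mathcal{D}_1}{A}\,.
    \]

The paper's claim is that, e.g., for bilateral and multiple-conclusion systems, analogous reductions must also cover the structural rules.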

Andrzej Indrzejczak: Hypersequents and Linear Time

Abstract: Hypersequent calculi (HC) are a generalised form of sequent calculi invented independently by Pottinger and Avron. Hypersequents are usually defined as finite sets or multisets of ordinary sequents. HC have proved very useful in the field of nonclassical logics, including several modal, many-valued, relevant, and paraconsistent logics. We consider the application of HC to temporal logics of linear time. The first approach, based on Avron's rule for modelling linearity, is sufficient for dealing with logics such as S4.3 or K4.3, where only future-looking operators are taken into account. In order to provide cut-free formalizations of bimodal logics such as Kt4.3, where past-looking operators are added, we need a different solution. The second approach is based on hypersequents treated as finite sequences of sequents. This solution is more expressive in the sense that the symmetry of past and future in linear time may be modelled by suitable rules.
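
Recall the shape of the objects involved (standard notation): a hypersequent is a finite set, multiset, or, in the second approach above, a sequence of ordinary sequents, usually written

    \[
      \Gamma_1 \Rightarrow \Delta_1 \;\mid\; \Gamma_2 \Rightarrow \Delta_2 \;\mid\; \cdots \;\mid\; \Gamma_n \Rightarrow \Delta_n,
    \]

and read disjunctively: the hypersequent is valid when at least one of its component sequents is.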

Reinhard Kahle: Axioms as Hypotheses

Abstract: In this talk we first discuss the shift in the understanding of the notion of axiom brought about by David Hilbert in the early 20th century. In the second part, we illustrate how the reading of axioms as hypotheses can give rise to a specific proof-theoretic semantics of non-logical axioms. While axioms are traditionally linked to the concepts of truth and evidence, modern mathematical logic allows them to be treated as hypotheses without prior ontological or epistemological commitments. This view was already prepared by the "discovery" of non-Euclidean geometries, which called the status of the parallel axiom into question; one may also think of the title of Riemann's seminal habilitation talk: On the hypotheses which lie at the foundation of geometry. It was, however, Hilbert who not only defended this account in the so-called Frege-Hilbert controversy, but who also founded on it mathematical logic as we know it today, and who thereby laid the foundations for the structuralist view of mathematics which became dominant in the 20th century.
In 1922, Hilbert even suggested adding Riemann's Hypothesis as an axiom to a mathematical theory:

Nothing prevents us from taking as axioms propositions which are provable, or which we believe are provable. Indeed, as history shows, this procedure is perfectly in order: [...] Riemann's conjecture about the zeroes of ζ(s), [...]

Since (to our knowledge today, as in Hilbert's time) Riemann's Hypothesis could be false, the question arises what status a mathematical theory would have if it incorporated Riemann's Hypothesis as an axiom. We will argue that such a "hypothetical axiom system" can have a natural proof-theoretic semantics despite the fact that its consistency is unknown.

Sergey Melikhov: A joint logic of problems and propositions, a modified BHK-interpretation and proof-relevant topological models of intuitionistic logic

Abstract: In a 1985 commentary on his collected works, Kolmogorov remarked that his 1932 paper ("Zur Deutung der intuitionistischen Logik") "was written in hope that with time, the logic of solution of problems will become a permanent part of [a standard] course of logic. Creation of a unified logical apparatus dealing with objects of two types — propositions and problems — was intended." We describe such a formal system, QHC, which is a conservative extension of both the intuitionistic predicate calculus, QH, and the classical predicate calculus, QC. Moreover:

  • The only new connectives ? and ! of QHC induce a Galois connection (i.e., a pair of adjoint functors) between the Lindenbaum algebras of QH and QC, regarded as posets; see the sketch after this list.
  • Kolmogorov’s double negation translation of propositions into problems extends to a retraction of QHC onto QH.
  • Gödel’s provability translation of problems into modal propositions extends to a retraction of QHC onto its QC+(?!) fragment, which can be identified with the modal logic QS4.
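
A minimal sketch of the adjunction from the first item (our rendering; the orientation is chosen so that ?! agrees with the S4 modality □ of the third item, which satisfies □p → p):

    \[
      ?\gamma \vdash p \;\Longleftrightarrow\; \gamma \vdash\, !p,
      \qquad\text{with unit } \gamma \vdash\, !?\gamma \text{ and counit } ?!p \vdash p,
      \qquad \Box p \,:=\, ?!p.
    \]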

This leads to a new paradigm that views intuitionistic logic not as an alternative to classical logic that criminalizes some of its principles, but as an extension package that upgrades classical logic without removing it. The purpose of the upgrade is proof-relevance, or "categorification"; thus, if the classical conjunction and disjunction are to be modeled by intersection and union (of sets), the intuitionistic conjunction and disjunction will be modeled by product and disjoint union (of sheaves of sets). More formally, we describe a family of sheaf-valued models of QH, inspired by dependent type theory (not to be confused with the well-known open set-valued sheaf models of QH), which can be regarded as proof-relevant analogues of classical topological models of Tarski et al., and which extend to models of QHC. We also give an interpretation of some of these models in terms of stable solutions of algebraic equations; this can be seen as a proof-relevant counterpart of Novikov’s classical interpretation of some Tarski models in terms of weighting of masses.
The new paradigm also suggests a rethink of the Brouwer–Heyting–Kolmogorov interpretation of QH. Traditional ways to understand intuitionistic logic (semantically) have been rooted either in philosophy, with emphasis on the development of knowledge (Brouwer, Heyting, Kripke), or in computer science, with emphasis on effective operations (Kleene, Markov, Martin-Löf). Our "modified BHK interpretation" is rooted in the usual mathematical ideas of canonicity and stability; it emphasizes simply the order of quantifiers, developing Kolmogorov's original approach. This interpretation is compatible with two complete classes of models of QH: (i) Tarski topological models and (ii) the set-theoretic models of Medvedev–Skvortsov and Läuchli; as well as with (iii) our sheaf-valued models.

Grigory Olkhovikov: Truth-value gaps and paradoxes of material implication

Abstract: One of the uses of multi-valued logic is to overcome some restrictions of the classical two-valued one. A particularly well-known set of restrictions is connected with the analysis of conditionals in two-valued logic, where one finds the so-called paradoxes of material implication. One of these paradoxes is basically as follows: a conditional with a false antecedent is, from a commonsense point of view, a truth-value gap, but it is true in two-valued logic.
The resources of three-valued logic provide an obvious solution to this paradox: interpret sentences exhibiting a truth-value gap as having the third (non-designated) truth-value and define implication accordingly. Strangely enough, this way out was not seriously tried until recently. In our talk we recapitulate the main results obtained thus far about a three-valued logic which realizes this kind of definition of implication, together with natural definitions of disjunction, conjunction and quantifiers. We briefly address complete and consistent Hilbert-style axiomatizations of this logic (for both the propositional fragment and the full first-order version) and its expressive power.
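
A small Python sketch of the kind of implication table the abstract points to; only the first clause (false antecedent yields a gap) is fixed by the abstract, and the remaining clauses are our illustrative guesses, not the tables from the talk:

    # Three truth values: T (true, the designated value), F (false), N (gap).
    T, F, N = "T", "F", "N"
    VALUES = (T, F, N)

    def implies(a, b):
        """Gap-respecting conditional (sketch)."""
        if a == F:
            return N   # false antecedent: a truth-value gap, per the abstract
        if a == T:
            return b   # true antecedent: value of the consequent (assumption)
        return N       # gapped antecedent: gap (assumption)

    # Print the full 3x3 table for inspection.
    for a in VALUES:
        for b in VALUES:
            print(f"{a} -> {b} : {implies(a, b)}")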

Zoran Petrić: Cuts and Graphs

Abstract: Plural (or multiple-conclusion) cuts are inferences made by applying a structural rule introduced by Gentzen for his sequent formulation of classical logic. As singular (single-conclusion) cuts yield trees, which underlie ordinary natural deduction derivations, so plural cuts yield graphs of a more complicated kind, related to trees. The graphs of plural cuts are interesting in particular when the plural cuts are appropriate for sequent systems without the structural rule of permutation. The aim of this talk, which is based on joint work with Kosta Došen, is to define these graphs both inductively and in a purely combinatorial manner.
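
For reference, the structural rule in question is Gentzen's multiple-conclusion cut from LK:

    \[
      \frac{\Gamma \Rightarrow \Delta, A \qquad A, \Pi \Rightarrow \Lambda}{\Gamma, \Pi \Rightarrow \Delta, \Lambda}\ (\mathrm{cut})
    \]

In the singular case Δ is empty and Λ is a single formula, and composing such cuts produces tree-like figures, as the abstract notes; in the plural case the side formulas Δ and Λ allow several cuts to interlock, which is what produces the more general graphs the talk describes.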

Paolo Pistone: Rules, types and the transcendence of second order logic

Abstract: As soon as formulas of the form "there exists an X such that ..." or "for all integers n ..." are involved, the justification of logical rules by harmony (i.e. cut-elimination) looks like an empty shell: the intrinsic circularity of such arguments is so harmful that we cannot, without accepting some preconceived assumptions, decide between "sound" and "paradoxical" versions of the Hauptsatz for higher-order logics.
The aim of this talk is to argue that the acceptance of this fundamental form of ignorance (intimately related to the incompleteness theorems) makes room for a constructive understanding of second order logic.

Francesca Poggiolesi: Counterfactual logics: natural deduction calculi and sequent calculi

Abstract: Counterfactual logics, which have a long and venerable history, were introduced to capture counterfactual sentences, i.e. conditionals of the form "if A were the case, then B would be the case", where A is false. If we interpret counterfactuals as material conditionals, then all counterfactuals come out trivially true, which is an unpleasant conclusion. Counterfactual logics, on the other hand, allow a different and meaningful interpretation of counterfactual sentences.
There are several different systems of counterfactual logics. Amongst them we focus on the system CK and its standard extensions, namely CK + {ID, MP, CEM}. These systems have a simple and useful semantics. One just needs to consider a set of possible worlds W and a selection function f; for each world i and each formula A, f selects the set of worlds of W which are closest to i given the information A. Thus a counterfactual sentence A > B is true at a world i if, and only if, B is true at all those worlds that are closest to i given the information A.
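
The truth clause for A > B invites a direct model-checking sketch; here is a toy version in Python (the model, the valuation, and all names such as `worlds` and `sel` are hypothetical, ours for illustration):

    # Toy selection-function semantics for the counterfactual A > B:
    # A > B is true at world i iff B holds at every world that the
    # selection function picks for i under the information A.

    worlds = {"w0", "w1", "w2"}

    # Valuation: which atoms hold at which world (made-up data).
    val = {"w0": {"A"}, "w1": {"A", "B"}, "w2": {"B"}}

    def holds(atom, w):
        return atom in val[w]

    # Selection function f(i, A): the worlds closest to i given A.
    # Here crudely stipulated as all A-worlds, just to have something to run.
    def sel(i, antecedent):
        return {w for w in worlds if holds(antecedent, w)}

    def counterfactual(antecedent, consequent, i):
        return all(holds(consequent, w) for w in sel(i, antecedent))

    print(counterfactual("A", "B", "w0"))  # False: w0 is selected but lacks B
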
In this talk we aim to present a method for generating sequent calculi and natural deduction calculi for the system CK and its extensions. The method is based on, and fully exploits, the simple semantic interpretation of these systems. The sequent calculi and natural deduction calculi are proved to be equivalent; moreover, we prove that derivations in the natural deduction calculi normalize, and that the sequent calculi are contraction-free, weakening-free and cut-free, with all their logical rules invertible.

Erdinç Sayan: How do vacuous truths become laws?

Abstract: Idealizations and approximations used in science are assumptions or hypotheses which are known to be false. In this paper I examine the problems of testing pertaining to idealizational laws, which are law statements whose antecedents contain idealizational clauses. I argue that the major theories of confirmation, such as Popperian falsificationism and Bayesianism, face difficulties in accounting for the confirmation and disconfirmation of such laws. Underlying their difficulties is the fact that those laws are about nonexistent objects. I propose an analysis of the meaning and the confirmation conditions for idealizational laws.
Consider the simple conception that scientific laws are unrestricted universal material-conditional statements of the form "For all x, Fx → Gx". If we assume this interpretation of laws, the following problem arises. If no object instantiates the antecedent, as when F refers to the property, for example, of being a point mass or a frictionless plane, the law is true "vacuously." But of course we cannot be content with our laws being true vacuously.
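
The vacuity is just the classical truth condition for the material conditional: if nothing is F, every instance of the conditional has a false antecedent and so comes out true:

    \[
      \lnot \exists x\, Fx \;\vDash\; \forall x\,(Fx \to Gx),
      \qquad\text{since } Fx \to Gx \text{ is true whenever } Fx \text{ is false.}
    \]
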
The analysis I propose of the testing of idealizational laws is based on the fact that idealized objects comprising the domains of such laws can be approximated by real objects.

Guillaume Schlaepfer: Scientific modeling: a two layer based hypothetical reasoning

Abstract: The framework of my talk is scientific explanation, in relation to criticisms addressed to the deductive-nomological (DN) model of explanation (Hempel 1964). The criticisms are the "relevance" problem, according to which some laws allow one to deduce particular events without explaining them, and the "special sciences" problem, which exploits the view that the special sciences, like biology, have no natural laws although they do provide explanations, hence the unsuitability of a nomological account.

The discussion is based on developments concerning the semantics of ceteris paribus laws, which, instead of being universal laws, are prefaced by the semantically opaque negative condition "if nothing else impinges, then ...". Such laws are widespread in the special sciences, although they do not qualify as natural laws since they lack universality. I will be interested in the dispositional view, which regards ceteris paribus laws as particular compositions of universal laws based on causal dispositions or capacities (Cartwright 1989). The consequence for hypothetical reasoning is a logic with two layers: dispositional laws (with properties as arguments) and compositional rules (with laws as arguments). The idea is to draw attention to some implications of this logical structure and to use it as a possible solution to the problems mentioned above.