Logik und Sprachtheorie

Research – Personal

Significant papers of the past 20 years, together with abstracts. For further information, including bibliographic details and DOIs, see the list of publications.

 

Prawitz’s Completeness Conjecture: A Reassessment

To appear in a special issue of Theoria entitled "Dag Prawitz and Per Martin-Löf -- Rolf Schock laureates" (edited by Peter Pagin and Dag Westerståhl). It is based on papers presented at the symposium "Validity of Inference" celebrating the award of the Rolf Schock Prize in Logic and Philosophy to Per Martin-Löf and Dag Prawitz (Stockholm, 26 October 2022).

In 1973 Dag Prawitz conjectured that the calculus of intuitionistic logic is complete with respect to his notion of validity of arguments. Against the background of the recent disproof of this conjecture by Piecha, de Campos Sanz and Schroeder-Heister, we discuss possible strategies for saving Prawitz's intentions. We argue that Prawitz's original semantics, which is based on the principal frame of all atomic systems, should be replaced with a general semantics, which also takes into account restricted frames of atomic systems. We discard the option of not considering extensions of atomic systems, but acknowledge the need to incorporate definitional atomic bases in the semantic framework. It turns out that ideas and results by Westerståhl on the Carnap categoricity of intuitionistic logic can be applied to Prawitz semantics. This implies that Prawitz semantics has a status of its own as a genuine, though incomplete, semantics of intuitionistic logic. An interesting side result is the fact that every formula satisfiable in general semantics is satisfiable in an axioms-only frame (a frame whose atomic systems do not contain proper rules). We draw a parallel between this seemingly paradoxical result and Skolem's paradox in first-order model theory.

 
Proof-Theoretic Semantics: An Autobiographical Survey

In: Thomas Piecha & Kai Wehmeier (eds), Peter Schroeder-Heister on Proof-Theoretic Semantics, Outstanding Contributions to Logic 29, Springer 2024, 1-51. DOI: 10.1007/978-3-031-50981-0_1.

In this autobiographical sketch, which is followed by a bibliography of my writings, I try to relate my intellectual development to problems, ideas and results in proof-theoretic semantics on which I have worked and to which I have contributed.
 

Comments on the contributions

In: Thomas Piecha & Kai Wehmeier (eds), Peter Schroeder-Heister on Proof-Theoretic Semantics, Outstanding Contributions to Logic 29, Springer 2024, 443-455. DOI: 10.1007/978-3-031-50981-0_17.

The contributions to this volume represent a broad range of aspects of proof-theoretic semantics. Some do so in the narrower, and some in the wider sense of the term. Some deal with issues I have been concerned with directly, and some tackle further problems. All of them open interesting new perspectives and develop the field in different directions. I will briefly comment on the significance of each contribution here.

 
The Logical Writings of Karl Popper

(edited jointly with David Binder and Thomas Piecha)

Springer 2022. Trends in Logic series. 

This open access book is the first ever collection of Karl Popper's writings on deductive logic. 

Karl R. Popper (1902-1994) was one of the most influential philosophers of the 20th century. His philosophy of science ("falsificationism") and his social and political philosophy ("open society") have been widely discussed far beyond academic philosophy. What is not so well known is that Popper also produced a considerable body of work on the foundations of deductive logic, most of it published at the end of the 1940s as articles in scattered places. This little-known work deserves to be better known, as it is highly significant for modern proof-theoretic semantics.

This collection assembles Popper's published writings on deductive logic in a single volume, together with all reviews of these papers. It also contains a large amount of unpublished material from the Popper Archives, including Popper's correspondence related to deductive logic and manuscripts that were (almost) finished, but did not reach the publication stage. All of these items are critically edited with additional comments by the editors. A general introduction puts Popper's work into the context of current discussions on the foundations of logic. This book should be of interest to logicians, philosophers, and anybody concerned with Popper's work.

 
 Popper’s theory of deductive logic

(jointly with D. Binder and T. Piecha)

In: David Binder, Thomas Piecha and Peter Schroeder-Heister (eds.), The Logical Writings of Karl Popper. Berlin: Springer, pp. 1–79. DOI: 10.1007/978-3-030-94926-6_1.

We present Popper’s theory of deductive logic as exhibited in his articles published between 1947 and 1949. After an introduction to Popper’s inferentialist approach and his idea of “inferential definitions” of logical constants, we discuss in more formal detail Popper’s general theory of the deducibility relation (which, using Gentzen’s terminology, might be called his “structural” theory), as well as his special theory of logical constants. We put special emphasis on his inferential notion of duality, which includes his analysis of the “anti-conditional” (today called “co-implication”), his systematic study of various forms of negation including classical, intuitionistic and weaker negations, his system of bi-intuitionistic logic, and his theory of quantification and identity. We also touch on his treatment of modal operators, which is based on Carnap’s Meaning and Necessity.

 

Popper's correspondence with Peter Schroeder-Heister

In: David Binder, Thomas Piecha and Peter Schroeder-Heister (eds.), The Logical Writings of Karl Popper. Berlin: Springer, pp. 509-522. DOI: 10.1007/978-3-030-94926-6_32.

Shortly after receiving his Dr. phil. from the University of Bonn in 1981, Peter Schroeder-Heister came across Popper’s logical writings and realized that the inferentialism advocated by Popper bears a significant relationship to the idea of proof-theoretic semantics. Subsequently, in 1982, he wrote a paper on Popper’s theory of deductive inference, stressing in particular the issue of demarcating logical from extra-logical signs (published 1984). Sending a draft of this paper to Popper led to the correspondence reproduced here.

 

Intensional proof-theoretic semantics and the rule of contraction

In: Logically Speaking: A Festschrift for Marie Duží. Ed. by Pavel Materna and Bjørn Jespersen. London: College Publications 2022.

The structural rule of contraction, which allows the identification of two occurrences of the same formula, is discussed from the viewpoint of intensional proof-theoretic semantics. It is argued that sometimes such occurrences have different meanings and therefore should not be identified. This suggests a restriction of contraction based on a definitional (and thus intensional) ordering of formulas within proofs, which is related to ramification in type theories such as the one proposed by Marie Duží.
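
For reference, the structural rule at issue is contraction, here in its usual sequent-calculus formulation (standard notation, not specific to this paper):

\[
\frac{\Gamma, A, A \vdash C}{\Gamma, A \vdash C}
\]

Roughly, the proposal is to license this step only when the two displayed occurrences of A have the same place in the definitional ordering and can therefore be regarded as having the same meaning.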

 
Axiomatic Thinking, Identity of Proofs and the Quest for an Intensional Proof-Theoretic Semantics

In: F. Ferreira, R. Kahle, and G. Sommaruga (Eds.), Axiomatic Thinking I, Springer 2022.

Starting from Hilbert's Axiomatic Thinking, the problem of identity of proofs and its significance is discussed in an elementary proof-theoretic setting. Identifying two proofs, one of which is obtained from the other by removing redundancies, leads, when used as a universal method, to a collapse of all proofs of a provable proposition into one single proof and thus trivialises proof identity. Principles of proof-theoretic harmony with restricted means of redundancy reduction might be used instead, though this limits one to a certain format of formal systems. The discussion of proof identity suggests the claim that annotations of proofs, which tell the reader which rule is applied at a particular step, must be considered part of the proof itself. As a general perspective, it is advocated that the investigation of intensional aspects of proofs should be given more space in proof theory and proof-theoretic semantics.

 
How to Ekman a Crabbé-Tennant

(jointly with L. Tranchini)

Synthese 199 (2021), 617-639. https://doi.org/10.1007/s11229-018-02018-3

Developing early results of Prawitz, Tennant proposed a criterion for an expression to count as a paradox in the framework of Gentzen’s natural deduction: paradoxical expressions give rise to non-normalizing derivations. Two distinct kinds of cases, going back to Crabbé and Tennant, show that the criterion overgenerates, that is, there are derivations which are intuitively non-paradoxical but which fail to normalize. Tennant’s proposed solution consists in reformulating natural deduction elimination rules in general (or parallelized) form. Developing intuitions of Ekman we show that the adoption of general rules has the consequence of hiding redundancies within derivations. Once reductions to get rid of the hidden redundancies are devised, it is clear that the adoption of general elimination rules offers no remedy to the overgeneration of the Prawitz–Tennant analysis. In this way, we indirectly provide further support for a solution to one of the two overgeneration cases developed in previous work.
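
As an illustration (standard in the literature rather than taken from the paper), conjunction elimination in general (parallelized) form replaces the two projection rules by a single rule that discharges both conjuncts as assumptions:

\[
\frac{A \wedge B \qquad \begin{array}{c}[A,\ B]\\ \vdots\\ C\end{array}}{C}
\]

The point at issue is that rules of this form can conceal redundancies within derivations which the standard reductions no longer detect.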

 
Die Grundlagen mathematischer Beweise. Philosophisch-logische Überlegungen

In: Mitteilungen der Mathematischen Gesellschaft in Hamburg 38 (2018), 117-137

After a brief sketch of the origins of proof theory and of Hilbert's idea to regard proofs as mathematical objects, the problem of identity of proofs is discussed from the point of view of general proof theory and proof-theoretic semantics. Fundamental problems associated with the 'redundancy view', according to which proofs remain the same when obvious redundancies are removed, are pointed out. The formulation of a proper notion of harmony between logical rules is considered a way out of these problems. It is also emphasized that the annotation of logical (and extra-logical) rules applied in a proof is essential for the proof itself and thus a crucial part of it. This idea leads to type-theoretic conceptions of logical and mathematical reasoning.

 

Incompleteness of Intuitionistic Propositional Logic with Respect to Proof-Theoretic Semantics

(jointly with T. Piecha)

In T. Piecha and P. Schroeder-Heister (eds), Special issue on General Proof Theory, Studia Logica 107(1), 233-246, 2019. https://doi.org/10.1007/s11225-018-9823-7

Prawitz (1973, 2014) proposed certain notions of proof-theoretic validity and conjectured that intuitionistic logic is complete for them. Considering propositional logic, we present a general framework of five abstract conditions which any proof-theoretic semantics should obey. Then we formulate several more specific conditions under which the intuitionistic propositional calculus (IPC) turns out to be semantically incomplete. Here a crucial role is played by the generalized disjunction principle. Turning to concrete semantics, we show that prominent proposals, including Prawitz’s, satisfy at least one of these conditions, thus rendering IPC semantically incomplete for them. Only for Goldfarb’s (2016) proof-theoretic semantics, which deviates from standard approaches, does IPC turn out to be complete. Overall, these results show that basic ideas of proof-theoretic semantics for propositional logic are not captured by IPC.

 

General Proof Theory: Introduction

(jointly with T. Piecha)

In T. Piecha and P. Schroeder-Heister (eds), Special issue on General Proof Theory, Studia Logica 107(1), 1-5, 2019. https://doi.org/10.1007/s11225-018-9818-4

This special issue on general proof theory collects papers resulting from the conference on general proof theory held in November 2015 in Tübingen.

 
Intuitionistic logic is not complete for standard proof-theoretic semantics

(jointly with T. Piecha)

Abstract for the ASL Logic Colloquium 2017, Stockholm, 14-20 August. Bulletin of Symbolic Logic 24 (2018), 262.

 

Ekman's Paradox

(jointly with L. Tranchini)

Notre Dame Journal of Formal Logic 58, 2017. https://doi.org/10.1215/00294527-2017-0017

Prawitz observed that Russell’s paradox in naive set theory yields a derivation of absurdity whose reduction sequence loops. Building on this observation, and based on numerous examples, Tennant claimed that this looping feature, or more generally, the fact that derivations of absurdity do not normalize, is characteristic of the paradoxes. Striking results by Ekman show that looping reduction sequences are already obtained in minimal propositional logic, when certain reduction steps, which are prima facie plausible, are considered in addition to the standard ones. This shows that the notion of reduction is in need of clarification. Referring to the notion of identity of proofs in general proof theory, we argue that reduction steps should not merely remove redundancies, but must respect the identity of proofs. Consequently, we propose to modify Tennant’s paradoxicality test by basing it on this refined notion of reduction.

 

The Definitional View of Atomic Systems in Proof-Theoretic Semantics

(jointly with T. Piecha)

In: P. Arazim & T. Lávička (eds), The Logica Yearbook 2016, College Publications, London, 2017, pp. 185-200.

Atomic systems, that is, sets of rules containing only atomic formulas, play an important role in proof-theoretic notions of logical validity. We consider a view of atomic systems as definitions that allows us to discuss a proposal made by Prawitz (2016). The implementation of this view in the base case of an inductive definition of validity leads to the problem that derivability of atomic formulas in an atomic system does not coincide with the validity of these formulas. This is due to the fact that, on the definitional view of atomic systems, there are not just production rules, but both introduction and elimination rules for atoms, which may even generate non-normalizable atomic derivations. This shows that the way atomic systems are handled is a fundamental issue of proof-theoretic semantics.

 

Restricting initial sequents: The trade-offs between identity, contraction and cut

In: R. Kahle, T. Strahm & T. Studer (eds), Advances in Proof Theory, Progress in Computer Science and Applied Logic 28, pp. 339-351, Birkhäuser 2016.

In logical sequent calculi, initial sequents expressing the axiom of identity can be required to be atomic, without affecting the deductive strength of the system. When extending the logical system with right- and left-introduction rules for atomic formulas, one can analogously require that initial sequents be restricted to "uratoms", which are undefined (not even vacuously defined) atoms. Depending on the definitional clauses for atoms, the resulting system is possibly weaker than the unrestricted one. This weaker system may however be preferable to the unrestricted system, as it enjoys cut elimination and blocks unwanted derivations arising from non-wellfounded definitions, for example in the context of paradoxes.
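
A hedged sketch of the restriction in standard sequent-calculus notation: initial sequents are admitted only in atomic form,

\[
p \vdash p
\]

where p is an uratom. Identity sequents for logically complex formulas are then recovered by induction, using the left- and right-introduction rules; for defined atoms this recovery can fail, which is why the restricted system may be weaker, as described above.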

 

Atomic Systems in Proof-Theoretic Semantics: Two Approaches

(jointly with T. Piecha)

In: J. Redmond, O. Pombo Martins and Á. Nepomuceno Fernández (eds), Epistemology, Knowledge and the Impact of Interaction, Logic, Epistemology, and the Unity of Science, vol. 38, pp. 47-62, Springer 2016. 

Atomic systems are systems of rules containing only atomic formulas. In proof-theoretic semantics for minimal and intuitionistic logic they are used as the base case in an inductive definition of validity. We compare two different approaches to atomic systems. The first approach is compatible with an interpretation of atomic systems as representations of states of knowledge. The second takes atomic systems to be definitions of atomic formulas. The two views lead to different notions of derivability for atomic formulas, and consequently to different notions of proof-theoretic validity. In the first approach, validity is stable in the sense that for atomic formulas logical consequence and derivability coincide for any given atomic system. In the second approach this is not the case. This indicates that atomic systems as definitions, which determine the meaning of atomic sentences, might not be the proper basis for proof-theoretic validity, or conversely, that standard notions of proof-theoretic validity are not appropriate for definitional rule systems.
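
A minimal toy example (my own, purely for illustration): let the atomic system S contain the atom b as an axiom together with the single rule

\[
\frac{b}{a}
\]

allowing one to pass from b to a. On the first reading, S records a state of knowledge that may later be extended by further atomic rules; on the second, S is a definition of a and b, so that nothing beyond these clauses counts as a ground for asserting them.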

 

General Proof Theory. Celebrating 50 Years of Dag Prawitz's "Natural Deduction". Proceedings of the Conference held in Tübingen, 27-29 November 2015

(ed. jointly with T. Piecha)

https://doi.org/10.15496/publikation-10394, University of Tübingen 2016.

General proof theory studies how proofs are structured and how they relate to each other, and not primarily what can be proved in particular formal systems. It has been developed within the framework of Gentzen-style proof theory, as well as in categorial proof theory. As Dag Prawitz's monograph "Natural Deduction" (1965) paved the way for this development (he also proposed the term "General Proof Theory"), it was most fitting to celebrate the 50th anniversary of this work with a conference devoted to this topic. The conference took place on 27-29 November 2015 in Tübingen at the Department of Philosophy. The proceedings collect abstracts, slides and papers of the presentations given, as well as contributions from two speakers who were unable to attend.

 

Proof-theoretic validity based on elimination rules

In: E. H. Haeusler, W. de Campos Sanz & B. Lopes (eds), Why is this a Proof? Festschrift for Luiz Carlos Pereira, College Publications, London 2015.

In the tradition of Dummett-Prawitz-style proof-theoretic semantics, which considers validity of derivations or proofs as one of its core notions, this paper sketches an approach to proof-theoretic validity based on elimination rules and assesses its merits and limitations. Some remarks are made on alternative approaches based on the idea of dualizing connectives and proofs, as well as on definitional reflection using elimination clauses.

 

Advances in Proof-Theoretic Semantics

(ed. jointly with T. Piecha)

Trends in Logic 43, Springer 2016.

This volume is the first ever collection devoted to the field of proof-theoretic semantics. Contributions address topics including the systematics of introduction and elimination rules and proofs of normalization, the categorial characterization of deductions, the relation between Heyting's and Gentzen's approaches to meaning, knowability paradoxes, proof-theoretic foundations of set theory, Dummett's justification of logical laws, Kreisel's theory of constructions, paradoxical reasoning, and the defence of model theory.

The field of proof-theoretic semantics has existed for almost 50 years, but the term itself was proposed by Schroeder-Heister in the 1980s. Proof-theoretic semantics explains the meaning of linguistic expressions in general and of logical constants in particular in terms of the notion of proof. This volume emerges from presentations at the Second International Conference on Proof-Theoretic Semantics in Tübingen in 2013, where contributing authors were asked to provide a self-contained description and analysis of a significant research question in this area. The contributions are representative of the field and should be of interest to logicians, philosophers, and mathematicians alike.


 

Open problems in proof-theoretic semantics

In: T. Piecha & P. Schroeder-Heister (eds), Advances in Proof-Theoretic Semantics, Trends in Logic 43, Springer 2016, pp. 253-283.

I present three open problems the discussion and solution of which I consider relevant for the further development of proof-theoretic semantics: (1) The nature of hypotheses and the problem of the appropriate format of proofs, (2) the problem of a satisfactory notion of proof-theoretic harmony, and (3) the problem of extending methods of proof-theoretic semantics beyond logic.

 

Logic, Methodology and Philosophy of Science. Logic and Science Facing the New Technologies. Proceedings of the 14th International Congress (Nancy)

(ed. jointly with G. Heinzmann, W. Hodges and P. E. Bour)

College Publications, London 2014. Table of contents and preface.

The 14th International Congress of Logic, Methodology and Philosophy of Science was held on July 19-26, 2011 in Nancy, the historic capital of Lorraine and the birthplace of Henri Poincaré. For the first time in LMPS history, the Nancy congress had a special topic: Logic and Science Facing the New Technologies. These proceedings include state-of-the-art discussions by leading scholars.

Besides plenary talks, they contain many of the invited papers from the four sections: Logic, General Philosophy of Science, Methodological and Philosophical Issues of Particular Sciences, and Methodological and Philosophical Issues in Technology. Further papers result from colloquia on quantum information, on the notion of algorithm and on mathematics in relation to the new technologies.
 

Proceedings of the Conference on Hypothetical Reasoning, 23-24 August 2014, Tübingen

(ed. jointly with T. Piecha)

https://doi.org/10.15496/publikation-415, Universität Tübingen, 2015.

Hypothetical reasoning or reasoning under assumptions is a key concept of logic, philosophy of science and mathematics. The Conference on Hypothetical Reasoning focussed on its logical aspects, such as assumption-based calculi and their proof theory, logical consequence from a proof-theoretic or model-theoretic point of view, logics of conditionals, proof systems, structure of assumption-based proofs, hypotheses in proof-theoretic semantics, notions of implication, substructural logics, hypotheses in categorial logic, logical aspects of scientific explanation, hypothetical reasoning in mathematics, and reasoning from definitions and axioms. The conference took place on 23-24 August 2014 in Tübingen at the Department of Philosophy, in conjunction with ESSLLI 2014. The proceedings collect abstracts, slides and papers of the presentations given.

 

Frege’s sequent calculus

In: A. Indrzejczak, J. Kaczmarek, M. Zawidzki (eds.), Trends in Logic XIII: Gentzen's and Jaśkowski's heritage - 80 years of natural deduction and sequent calculi, Łódź University Press 2014, pp. 233-245.

Frege’s logical system as developed in the "Grundgesetze der Arithmetik" can be regarded as a sequent calculus of a specific form.

 

Proof-theoretic harmony and the levels of rules: Generalised non-flattening results

(jointly with G. K. Olkhovikov)

In E. Moriconi & L. Tesconi (Eds.), Second Pisa Colloquium in Logic, Language and Epistemology, ETS, 2014, pp. 245-287.

If we generate elimination rules from introduction rules, or, conversely, introduction rules from elimination rules according to a general pattern, we often observe a rise in level: To introduction rules that are just production rules, there correspond elimination rules that discharge assumptions, and vice versa. In a previous publication we showed that this situation cannot always be avoided, i.e., that elimination and introduction rules cannot always be 'flattened'. More precisely, we showed that there are connectives with given introduction rules which do not have corresponding elimination rules in standard natural deduction, and vice versa. In this paper we generalise this result: Even if we allow for rules of higher levels, i.e. rules that may discharge rules used as assumptions, the level rise is often necessary. For every level n we can find a connective with introduction rules of level n whose corresponding elimination rules must at least have level n+1, and a connective with elimination rules of level n whose corresponding introduction rules must at least have level n+1.

 

Dialogical Logic for Definitional Reasoning and Implications as Rules

(jointly with T. Piecha)

In J. Mittelstraß & C. von Bülow (eds), Dialogische Logik, pp. 91-125, Mentis.

 

The Calculus of Higher-Level Rules, Propositional Quantification, and the Foundational Approach to Proof-Theoretic Harmony

In: Studia Logica, Special issue, ed. A. Indrzejczak, "Gentzen's and Jaśkowski's Heritage: 80 Years of Natural Deduction and Sequent Calculi", 102 (2014), 1185-1216.
A condensed version of this article appeared as

Proof-theoretic harmony: the issue of propositional quantification
in: A. Indrzejczak, J. Kaczmarek, M. Zawidzki (eds.), Trends in Logic XIII: Gentzen's and Jaśkowski's heritage - 80 years of natural deduction and sequent calculi, Łódź University Press 2014, 5-16.

We present our calculus of higher-level rules, extended with propositional quantification within rules. This makes it possible to present general schemas for introduction and elimination rules for arbitrary propositional operators and to define what it means that introductions and eliminations are in harmony with each other. This definition does not presuppose any logical system, but is formulated in terms of rules themselves. We therefore speak of a foundational (rather than reductive) account of proof-theoretic harmony. With every set of introduction rules a canonical elimination rule, and with every set of elimination rules a canonical introduction rule is associated in such a way that the canonical rule is in harmony with the set of rules it is associated with. An example given by Hazen and Pelletier is used to demonstrate that there are significant connectives which are characterized by their elimination rules and whose introduction rule is the canonical introduction rule associated with these elimination rules. Due to the availability of higher-level rules and propositional quantification, the means of expression of the framework developed are sufficient to ensure that the construction of canonical elimination or introduction rules is always possible and does not lead out of this framework.

 

Failure of completeness in proof-theoretic semantics

(jointly with Thomas Piecha and Wagner de Campos Sanz)

In: Journal of Philosophical Logic 44 (2015), 321-335. First published online August 1, 2014.

Several proof-theoretic notions of validity have been proposed in the literature, for which completeness of intuitionistic logic has been conjectured. We define validity for intuitionistic propositional logic in a way which is common to many of these notions, emphasizing that an appropriate notion of validity must be closed under substitution. In this definition we consider atomic systems whose rules are not only production rules, but may include rules that allow one to discharge assumptions. Our central result shows that Harrop’s rule is valid under substitution, which refutes the completeness conjecture for intuitionistic logic.
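
For readers who do not have it in mind, Harrop's rule is the following rule, admissible but not derivable in intuitionistic propositional logic:

\[
\frac{\neg A \to (B \vee C)}{(\neg A \to B) \vee (\neg A \to C)}
\]

Since this rule turns out to be valid under substitution while not being derivable, its validity refutes the completeness conjecture.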

 

Harmony in proof-theoretic semantics: A reductive analysis

In: H. Wansing (ed.), Dag Prawitz on Proofs and Meaning, Springer 2014.

We distinguish between the foundational analysis of logical constants, which treats all connectives in a single general framework, and the reductive analysis, which studies general connectives in terms of the standard operators. With every list of introduction or elimination rules proposed for an n-ary connective c, we associate a certain formula of second-order propositional logic. The formula corresponding to given introduction rules expresses the introduction meaning, the formula corresponding to given elimination rules the elimination meaning of c. We say that introduction and elimination rules for c are in harmony with each other when introduction meaning and elimination meaning match. Introduction or elimination rules are called flat, if they can discharge only formulas, but not rules as assumptions. We can show that not every connective with flat introduction rules has harmonious flat elimination rules, and conversely, that not every connective with flat elimination rules has harmonious flat introduction rules. If a harmonious characterisation of a connective is given, it can be explicitly defined in terms of the standard operators for implication, conjunction, disjunction, falsum and (propositional) universal quantification, namely by its introduction meaning or (equivalently) by its elimination meaning. It is argued that the reductive analysis of logical constants implicitly underlies Prawitz's (1979) proposal for a general schema for introduction and elimination rules.
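
As a standard illustration of the second-order definability involved (the familiar Russell-Prawitz style translation, not the paper's own notation), disjunction is explicitly definable in second-order propositional logic by

\[
A \vee B \;:=\; \forall p\,((A \to p) \to ((B \to p) \to p)),
\]

a formula which, read from the elimination side, encodes precisely the disjunction elimination rule.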
 

On Flattening Elimination Rules

(jointly with Grigory K. Olkhovikov)

Review of Symbolic Logic 7(1), 60-72, 2014.

In proof-theoretic semantics of intuitionistic logic it is well-known that elimination rules can be generated from introduction rules in a uniform way. If introduction rules discharge assumptions, the corresponding elimination rule is a rule of higher level, which allows one to discharge rules occurring as assumptions. In some cases these uniformly generated elimination rules can be equivalently replaced with elimination rules that only discharge formulas or do not discharge any assumption at all — they can be flattened in a terminology proposed by Read. We show by an example from propositional logic that not all introduction rules have flat elimination rules. We translate the general form of flat elimination rules into a formula of second-order propositional logic and demonstrate that our example is not equivalent to any such formula. The proof uses elementary techniques from propositional logic and Kripke semantics.
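
To indicate what a higher-level elimination rule looks like (notation merely suggestive), the uniformly generated elimination rule for implication allows one to discharge the rule leading from A to B when it has been used as an assumption:

\[
\frac{A \to B \qquad \begin{array}{c}[A \Rightarrow B]\\ \vdots\\ C\end{array}}{C}
\]

In this particular case the rule can be flattened, as it is equivalent to ordinary modus ponens; the result reported here shows that such flattening is not always available.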

 

Constructive semantics, admissibility of rules and the validity of Peirce's law

(jointly with Wagner de Campos Sanz and Thomas Piecha)

Logic Journal of the IGPL (2013), doi:10.1093/jigpal/jzt029.

In his approach to proof-theoretic semantics, Sandqvist claims to provide a justification of classical logic without using the principle of bivalence. Following ideas by Prawitz, his semantics relies on the idea that logical systems extend atomic systems, so-called "bases", with respect to which the validity of logically complex formulas is defined. We relate this approach to admissibility-based semantics and show that the latter significantly differs from the former. We also relate it to semantics based on the notion of construction, in which case the results obtained are essentially the same as Sandqvist's. We argue that the form of rules admitted in atomic bases determines which logical rules are validated, as does the question of whether bases are conceived as information states, which can be monotonically extended, or as non-extensible inductive definitions. This shows that the format of atomic bases is a highly relevant issue in proof-theoretic semantics.
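
For reference, Peirce's law is the formula

\[
((A \to B) \to A) \to A,
\]

which is classically valid but not derivable in intuitionistic logic; whether it is validated is thus a test case for the claim that the semantics justifies classical logic.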

 

Paradoxes and Structural Rules

In: C. Dutilh Novaes & O. T. Hjortland (eds.), Insolubles and Consequences: Essays in honour of Stephen Read. London: College Publications 2012, 203-211.

The derivation of many paradoxes can be blocked if the application of structural rules is locally restricted in certain ways. This is shown independently for identity, contraction, and cut. Imposing such local restrictions is seen as an alternative to the global rejection of structural rules (notably contraction), which is not a reasonable option given that structural rules are needed in mathematical reasoning.
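
In standard sequent-calculus notation (one common formulation), the three structural rules in question are identity (left), contraction (middle) and cut (right):

\[
\frac{}{A \vdash A}
\qquad
\frac{\Gamma, A, A \vdash C}{\Gamma, A \vdash C}
\qquad
\frac{\Gamma \vdash A \qquad \Delta, A \vdash C}{\Gamma, \Delta \vdash C}
\]

The proposal is to restrict particular applications of these rules locally, rather than to reject any of them globally.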

 

Proof-Theoretic Semantics, Self-Contradiction, and the Format of Deductive Reasoning

In: L. Tranchini (ed.), Anti-Realistic Notions of Truth, Special issue of Topoi vol. 31 no. 1, 2012.

From the point of view of proof-theoretic semantics, it is argued that the sequent calculus with introduction rules on the assertion and on the assumption side represents deductive reasoning more appropriately than natural deduction. In taking consequence to be conceptually prior to truth, it can cope with non-wellfounded phenomena such as contradictory reasoning. The fact that, in its typed variant, the sequent calculus has an explicit and separable substitution schema in the form of the cut rule is seen as a crucial advantage over natural deduction, where substitution is built into the general framework.

 

Implications as Rules in Dialogical Semantics

(jointly with Thomas Piecha)

In: M. Peliš and V. Punčochář (eds.), The Logica Yearbook 2011, College Publications, London, 2012, 211-225.

The conception of implications as rules is interpreted in Lorenzen-style dialogical semantics. Implications-as-rules are given attack and defense principles, which are asymmetric between proponent and opponent. Whereas on the proponent's side, these principles have the usual form, on the opponent's side implications function as database entries that can be used by the proponent to defend assertions independent of their logical form. The resulting system, which also comprises a principle of cut, is equivalent to the sequent-style system for implications-as-rules. It is argued that the asymmetries arising in the dialogical setting are not deficiencies but reflect the `pre-logical' structural character of the notion of rule.

 

Proof-Theoretic Semantics (SEP Entry)

In: Stanford Encyclopedia of Philosophy (Edward N. Zalta, ed.), forthcoming. URL = http://plato.stanford.edu/archives/win2012/entries/proof-theoretic-semantics/.

 

The categorical and the hypothetical: A critique of some fundamental assumptions of standard semantics

In: Sten Lindström, Erik Palmgren and Dag Westerståhl (eds.), The Philosophy of Logical Consequence and Inference. Special Issue of Synthese 187, 925-942, 2012.

The hypothetical notion of consequence is normally understood as the transmission of a categorical notion from premisses to conclusion. In model-theoretic semantics this categorical notion is `truth', in standard proof-theoretic semantics it is `canonical provability'. Three underlying dogmas, (1) the priority of the categorical over the hypothetical, (2) the transmission view of consequence, and (3) the identification of consequence and correctness of inference, are criticized from an alternative view of proof-theoretic semantics. It is argued that consequence is a basic semantical concept which is directly governed by elementary reasoning principles such as definitional closure and definitional reflection, and is not reduced to a canonical concept. This understanding of consequence makes it possible, in particular, to deal with non-wellfounded phenomena as they arise from circular definitions.

 

Definitional Reasoning in Proof-Theoretic Semantics and the Square of Opposition

In: J.-Y. Béziau and G. Payette (eds), New Perspectives on the Square of Opposition, Peter Lang, Bern, 2012.

Within a framework of reasoning with respect to clausal definitions of atoms, four forms of judgement are distinguished: Direct and indirect assertion, and direct and indirect denial. Whereas direct assertion and direct denial are established by directly applying a definitional clause ("definitional closure"), indirect assertion and indirect denial result from showing that all possible premisses of the opposite judgement can be refuted ("definitional reflection"). The deductive relationships between these four forms of judgement correspond to those represented in the square of opposition, if direct assertion and direct denial are placed in the A and E corners, and indirect assertion and indirect denial in the I and O corners of the square.

 

An alternative implication-left schema for the sequent calculus

Abstract for the ASL Logic Colloquium 2010, Paris, 25-31 July.

 

Implications-as-Rules vs. Implications-as-Links: An Alternative Implication-Left Schema for the Sequent Calculus

Journal of Philosophical Logic 40 (2011), 95-101.

The interpretation of implications as rules motivates a different left-introduction schema for implication in the sequent calculus, which is conceptually more basic than the implication-left schema proposed by Gentzen. Corresponding to results obtained for systems with higher-level rules, it enjoys the subformula property and cut elimination in a weak form.
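
For comparison, the implication-left schema proposed by Gentzen, relative to which the alternative is formulated, has the familiar form

\[
\frac{\Gamma \vdash A \qquad \Delta, B \vdash C}{\Gamma, \Delta, A \to B \vdash C}
\]

whereas the alternative schema is motivated by reading an implication A → B as a rule licensing the passage from A to B rather than as a link between the two premisses.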

 

Generalized elimination inferences, higher-level rules, and the implications-as-rules interpretation of the sequent calculus

Contribution to: Edward Hermann Haeusler, Luiz Carlos Pereira and Valeria de Paiva, editors, Advances in Natural Deduction, Springer 2014, 1-29.

We investigate the significance of higher-level generalized elimination rules as proposed by the author in relation to standard-level generalized elimination rules as proposed by Dyckhoff, Tennant, López-Escobar and von Plato. Many of the results established for natural deduction with higher-level rules such as normalization and the subformula principle immediately translate to the standard-level case. The sequent-style interpretation of higher-level natural deduction as proposed by Avron and by the author leads to a system with a weak rule of cut, which enjoys the subformula property. The interpretation of implications as rules motivates a different left-introduction schema for implication in the ordinary (standard-level) sequent calculus, which conceptually is more basic than the implication-left schema proposed by Gentzen. Corresponding to the result for the higher-level system, it enjoys the subformula property and cut elimination in a weak form.
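
A standard-level generalized elimination rule in the sense just described can be illustrated by the case of implication (the formulation familiar from von Plato and others):

\[
\frac{A \to B \qquad A \qquad \begin{array}{c}[B]\\ \vdots\\ C\end{array}}{C}
\]

Here only the formula B is discharged, whereas the corresponding higher-level rule discharges the rule leading from A to B itself; when major premisses of such rules are required to occur in top position, i.e. as assumptions, the system can be read as a sequent calculus in natural deduction format.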

 

Definitional Reflection and Basic Logic

In: Annals of Pure and Applied Logic 164 (2013), 491-501. Special issue "Advances in Constructive Topology and Logical Foundations" in honor of the 60th birthday of Giovanni Sambin (Maria Emilia Maietti, Erik Palmgren and Michael Rathjen, eds.).

In their Basic Logic, Sambin, Battilotti and Faggian give a foundation of logical inference rules by reference to certain reflection principles. We investigate the relationship between these principles and the principle of Definitional Reflection proposed by Hallnäs and Schroeder-Heister.

 

Schluß und Umkehrschluß. Ein Beitrag zur Definitionstheorie

Appears in revised form in: Carl Friedrich Gethmann (ed.), Lebenswelt und Wissenschaft. Akten des XXI. Deutschen Kongresses für Philosophie (Essen, 15-19 September 2008). Deutsches Jahrbuch Philosophie, vol. 2, 2011. Hamburg: Meiner, 1065-1092.

This paper investigates the problem of the invertibility of inferences in a definitional context. If an inductive definition is given in the form of a list of clauses, then the closure of this definition ("extremal clause", "nothing else is an A") allows one to invert definitional rules. The first part discusses Lorenzen's inversion principle as the prototype of inverting the direction of a definition. The second section examines the localized version of inversion in the form of definitional reflection. Section 3 shows how invertibility can be extended to negative contexts, that is, how inversion and reflection principles can also be formulated with respect to negations. Section 4 mentions the dualization of logical notions and points in particular to terminological problems connected with theories of co-implication. Section 5 sketches how admissibility and definitional reflection can themselves be dualized. Section 6 briefly points to the option of obtaining inference and inverse inference within the framework of dialogical logic from a single unified principle.

 

Sequent Calculi and Bidirectional Natural Deduction: On the Proper Basis of Proof-Theoretic Semantics

In: M. Peliš (ed.), The Logica Yearbook 2008, London: College Publications 2009.

This paper argues that the sequent calculus or, alternatively, bidirectional natural deduction should be chosen as the basis for proof-theoretic semantics. Here "bidirectional natural deduction" is proposed as a term for a natural deduction system with generalized elimination inferences, in which major premisses of elimination inferences always occur in top position (i.e., are assumptions). It can be viewed as a sequent calculus in natural deduction format. When interpreted semantically, such systems promise to overcome certain shortcomings of proof-theoretic semantics in the Dummett-Prawitz tradition, which is based on the standard unidirectional model of natural deduction. In particular, they do more justice to the notion of "assumption". Assumptions are now treated on a par with assertions, and open assumptions are no longer merely placeholders for closed proofs. Such an approach permits a smooth extension of proof-theoretic semantics beyond the realm of logical constants towards a general definitional theory of hypothetical reasoning.

 

Lorenzen's operative justification of intuitionistic logic

In: M. van Atten, P. Boldini, M. Bourdeau, G. Heinzmann (eds.), One Hundred Years of Intuitionism (1907-2007), Basel: Birkhäuser 2008.

We analyse the logic-oriented part of Lorenzen's "Introduction to Operative Logic and Mathematics" (1955) and reconstruct his theory of implication. It turns out that his much criticized doctrine of iterated admissibility statements and meta-calculi can be given a sensible form. Furthermore, we compare Lorenzen's theory with proof-theoretic semantics in the Dummett-Prawitz tradition, and show that both approaches can be translated into one another. It is argued that, from the viewpoint of an epistemological approach to semantics, the Dummett-Prawitz approach (as well as Lorenzen's later 'dialogical' or game-theoretical semantics) has certain advantages over Lorenzen's operative theory, if it is expected that knowledge of meaning should manifest itself in observable behaviour.

 

Nicholas Rescher on Greek Philosophy and the Syllogism

(jointly with Jürgen Mittelstraß)

In: R. Almeder (ed.), Rescher Studies (Festschrift for Rescher's 80th birthday), Ontos Verlag.

This paper comments on Rescher's books Cosmos and Logos: Studies in Greek Philosophy and Galen and the Syllogism, pointing out the significance of Greek and Islamic sources for Rescher's approach to philosophy. In the more technical sections, it relates Rescher's edition of a manuscript of Al-Salah to corresponding work carried out by A. I. Sabra. Furthermore, it gives a critical evaluation of Rescher's systematic account of categorical syllogistics and especially of the relevance (or irrelevance) of the fourth figure. Finally, it attempts to reconstruct Rescher's interpretation of the ekthesis principles of Aristotle's modal syllogistics.

 

Proof-Theoretic versus Model-Theoretic Consequence

In: M. Peliš (ed.), The Logica Yearbook 2007, Prague: Filosofia 2008, 187-200.

Both the model-theoretic notion of consequence in Tarski's sense and standard proof-theoretic notions of consequence in the BHK tradition and in the tradition of Dummett-Prawitz-style proof-theoretic semantics are defined in terms of a categorical concept (truth, validity, construction). It is argued that a proper proof-theoretic semantics should address the hypothetical concept of consequence directly, based on the sequent calculus as the formal model of reasoning. Inference rules are then justified by principles of definitional closure and definitional reflection with respect to a given clausal definition.

 

Generalized Definitional Reflection and the Inversion Principle

Logica Universalis 1 (2007), 355-376.

The term inversion principle goes back to Lorenzen, who coined it in the early 1950s. It was later used by Prawitz and others to describe the symmetric relationship between introduction and elimination inferences in natural deduction, sometimes also called harmony. Lorenzen's inversion principle has a much wider range than Prawitz's adaptation to natural deduction, dealing with the invertibility of rules of an arbitrary atomic production system. It is closely related to definitional reflection, which is likewise a principle for reasoning on the basis of rule-based atomic definitions. After presenting definitional reflection and the inversion principle, it is shown that the inversion principle can be formally derived from definitional reflection, when the latter is viewed as a principle for establishing admissibility. Furthermore, the relationship between definitional reflection and the inversion principle is investigated against the background of a universalization principle, called the ω-principle, which allows one to pass from the set of all defined substitution instances of a sequent to the sequent itself.
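
In the propositional case, and writing Γ ⊢ Δi for the derivability of all members of Δi from Γ, the two principles for an atom a defined by clauses with defining conditions Δ1, ..., Δn can be displayed as definitional closure (left) and definitional reflection (right):

\[
\frac{\Gamma \vdash \Delta_i}{\Gamma \vdash a}\;(1 \le i \le n)
\qquad\qquad
\frac{\Gamma, \Delta_1 \vdash C \quad \cdots \quad \Gamma, \Delta_n \vdash C}{\Gamma, a \vdash C}
\]

In the presence of variables matters are subtler; this is where the ω-principle mentioned above comes into play.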

 

Lorenzens operative Logik und moderne beweistheoretische Semantik

In: J. Mittelstraß (ed.), Der Konstruktivismus in der Philosophie im Ausgang von Wilhelm Kamlah und Paul Lorenzen. Paderborn: Mentis 2007, 167-196.

The logical part of Lorenzen's "Einführung in die operative Logik und Mathematik" (1955) contains ideas that deserve renewed discussion, assessment and appreciation from the standpoint of modern proof-theoretic semantics (since 1971). It turns out that Lorenzen's theory represents a significant step beyond constructive semantics of the BHK type, one which anticipates several ideas of proof-theoretic semantics. Moreover, owing to its very general conception, geared towards arbitrary production systems, it contains elements that should be developed further against the background of proof-theoretic semantics and of stimuli from the theory of logic programming. Contrary to Lorenzen's own assessment of his constructive foundation of logic, it should therefore not be understood merely as an intermediate stage on the way to a dialogical or game-theoretical semantics.

 

For further references (including downloadable items) see the list of publications.