
Modèles et normalisation des preuves (Models and proof normalization)

Denis Cousineau
TYPICAL - Types, Logic and computing
LIX - Laboratoire d'informatique de l'École polytechnique [Palaiseau], Inria Saclay - Ile de France
Abstract: The notion of theory split from the notion of logic in the late 1920s, when Hilbert and Ackermann distinguished between deduction rules, which do not depend on the object of discourse, and axioms, which do. The question then arose of how to characterize the theories, defined by sets of axioms, that can be used to formalize mathematical reasoning. A first criterion is consistency: the fact that not every proposition of the theory is provable. However, it gradually appeared that this criterion is not sufficient. The witness and disjunction properties of constructive proofs, and the completeness of certain automated deduction methods, do not follow from the consistency of a theory. All these properties are, however, consequences of a single one: proof normalization.

In 1930, Gödel's completeness theorem showed that consistency can be defined in different ways. It can be defined internally to the theory, as above, but it can also be defined algebraically: a theory is consistent if and only if it has a model. The equivalence between these two definitions is a fundamental tool that has made it possible to prove the consistency of many theories: set theory with the negation of the axiom of choice by Fraenkel and Mostowski, set theory with the axiom of choice and the continuum hypothesis by Gödel, set theory with the negation of the continuum hypothesis by Cohen, and so on. In contrast, the proof normalization property seemed to be definable only internally to the theory. Some model-theory-inspired criteria have been used to prove proof normalization for certain theories, but these criteria were not shown to be necessary.

In this thesis we propose an algebraic criterion that is both necessary and sufficient for proof normalization, showing that proof normalization, like consistency, can be characterized algebraically. For that purpose, we define a new notion of truth values algebras (TVAs), called language-dependent truth values algebras (LDTVAs). The notion of TVA makes it possible to exhibit the algebra of reducibility candidates (a notion introduced by Girard in 1970); the existence of a model valued in this algebra gives an algebraic criterion that is sufficient for proof normalization. We then define a refinement of the notion of reducibility candidates, as one of these LDTVAs, and prove that the existence of a model valued in this algebra gives an algebraic criterion that is not only sufficient but also necessary for proof normalization. This criterion is defined for the logical frameworks of minimal deduction modulo and the λΠ-calculus modulo. Finally, we illustrate the strength of the λΠ-calculus modulo by showing that all functional Pure Type Systems can be embedded in it.
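The abstract's central parallel between consistency and proof normalization can be summarized schematically as follows. This is a rough restatement in standard model-theoretic notation (T for a theory, M for a model), not the thesis's exact formulation:

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Schematic side-by-side of the two algebraic characterizations discussed in the abstract.
\[
\begin{array}{lcl}
  T \text{ is consistent}
    & \iff & \exists\, \mathcal{M}.\ \mathcal{M} \models T
    \qquad \text{(G\"odel, 1930)} \\[6pt]
  T \text{ has the proof normalization property}
    & \iff & \exists\, \mathcal{M}.\ \mathcal{M} \models T \text{ valued in the refined} \\
    & & \text{algebra of reducibility candidates (an LDTVA)} \\
    & & \qquad \text{(this thesis)}
\end{array}
\]
\end{document}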
Document type: Theses

Cited literature: 48 references

https://pastel.archives-ouvertes.fr/tel-00433165
Contributor: Denis Cousineau
Submitted on: Wednesday, November 18, 2009 - 12:48:54 PM
Last modification on: Wednesday, March 27, 2019 - 4:41:28 PM
Long-term archiving on: Tuesday, October 16, 2012 - 2:25:08 PM

Identifiers

  • HAL Id: tel-00433165, version 1

Citation

Denis Cousineau. Modèles et normalisation des preuves. Informatique [cs]. Ecole Polytechnique X, 2009. Français. ⟨tel-00433165⟩
