Personal page in French (not a translation of my English page, but rather a complement, with more specifically French-language information.)
Directeur de Recherche
Département de Mathématiques
UMR 8553 du CNRS
École Normale Supérieure
bureau V1 (passage vert)
45 rue d'Ulm,
F-75230 Paris Cedex 05,
tel: (33) (1) 44 32 33 87
fax: (33) (1) 44 32 20 80
followed by « at » ens.fr
CLASSIC INRIA project team
A talk at IST Austria
PAC-Bayes bounds using Gaussian posterior distributions
Dimension dependent and dimension free PAC-Bayes bounds for the Gram matrix
Two talks in Toulouse (March 26, 2013)
Toric grammars, a new stochastic model
The statistics of Principal Component Analysis
Statistical learning of syntactic structures
Toric Grammars: a new statistical
approach to natural language modeling, Olivier Catoni and Thomas Mainguy (2013) arXiv
The simulation shown in this paper was produced with the following code.
Please note that this is demonstration code only,
suitable for showing how the method behaves on small examples, but not
optimized to scale to large data sets.
A talk in Moscow, at the Institute for Information Transmission Problems (Nov 29, 2012)
Unsupervised statistical learning through label aggregation
A talk in Nice (on May 12, 2011, in French)
Petites perturbations des estimateurs et bornes PAC-Bayésiennes
Some lecture notes on PAC-Bayes bounds (Statistical learning, L3, ENS)
notes of 06/12/2013
notes of 04/02/2012
notes of 09/15/2011
My talk at ENS on March 16, 2011 (in French)
Apprentissage PAC-Bayésien : de la classification à la régression
My talk in Lille on January 21, 2011
La moyenne empirique est-elle perfectible ?
My latest preprints are on arXiv (and HAL)
Challenging the empirical mean and empirical variance: a deviation study,
Olivier Catoni (2010), on arXiv
High confidence estimates of the mean of heavy-tailed real random variables,
Olivier Catoni (2009), on arXiv.
This one can be skipped: it is an early draft of the previous preprint, which presents improved
estimators, improved bounds and some experiments.
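As a loose illustration of the theme of these preprints (estimating the mean of a heavy-tailed real random variable with better deviation guarantees than the empirical mean), here is a minimal sketch of the classical median-of-means device, a standard robust alternative from the same literature. This is not the estimator constructed in the preprints, just a simple baseline showing why one might want to go beyond the empirical mean.

```python
import random
import statistics

def median_of_means(xs, k):
    """Split xs into k contiguous blocks, average each block,
    and return the median of the block means. A single gross
    outlier can corrupt at most one block, so it is voted out
    by the median."""
    n = len(xs)
    if not 1 <= k <= n:
        raise ValueError("k must be between 1 and len(xs)")
    block = n // k
    means = [sum(xs[i * block:(i + 1) * block]) / block for i in range(k)]
    return statistics.median(means)

random.seed(0)
# 999 standard Gaussian samples (true mean 0) plus one gross outlier.
data = [random.gauss(0.0, 1.0) for _ in range(999)] + [1000.0]

print(abs(sum(data) / len(data)))       # empirical mean: pulled about 1.0 away from 0
print(abs(median_of_means(data, 10)))   # median of means: stays close to 0
```

The empirical mean is shifted by roughly `outlier / n = 1.0`, while the median-of-means estimate remains near the true mean, since the outlier contaminates only one of the ten blocks.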
Robust linear least squares regression, Jean-Yves Audibert, Olivier Catoni
(2010), on arXiv
Robust linear regression through PAC-Bayesian truncation, Jean-Yves Audibert,
Olivier Catoni (2010), on arXiv
Risk bounds in linear regression through PAC-Bayesian truncation,
Jean-Yves Audibert, Olivier Catoni (2009)
My talk in video.
This one can also be skipped: it is an early draft covering the
material of the two previous preprints.
Talk before the INRIA Project Committee ---
June 4, 2009
The slides of my talk to present the CLASSIC INRIA team proposal.
Evaluation of the DMA --- January 29, 2009
The slides of my talk on the occasion of the evaluation of the DMA.
Start-of-year day of the DMA --- October 2, 2008
You can download the slides of my talk.
Univ. Rennes 1, June 18-20 2007. The slides of my talk, ``Learning,
information theory and thermodynamics'', pdf file.
PAC-Bayesian supervised classification (The thermodynamics of
statistical learning).
This is the title of a monograph published in the Lecture Notes
series of the IMS. pdf file.
CV and report (in French)
I moved from the ENS to Paris 6 in October 1998, and back to the ENS in September 2008.
My older preprints and those
of some of my students
(Cécile Cot, Gilles Blanchard and Jean-Philippe Vert
who stayed at the ENS) can be found on the
preprint server of the
Laboratoire de Mathématiques de l'École Normale Supérieure.
Preprints from the period October 1998 - September 2008 are on the
server of the Laboratoire de Probabilités et
Modèles Aléatoires, on the
HAL server, or on arXiv.
You can also use the
national preprint search engine
of the Cellule MathDoc.
The last revision of ``The loop erased exit path and the metastability of
a biased vote process'', a joint paper with
Dayue Chen and Jun Xie, to appear in
Stochastic Processes and their Applications, is available.
The last revision of ``Free energy estimates and deviation inequalities''
offers a more precise study of the unbounded case and improved bounds for Markov
chains.
The last revision of ``Gibbs estimators'' describes
general integrability conditions under which it is possible to define a Gibbs
estimator and to bound its risk.
You can download the last revision of my
paper on ``Simulated Annealing Algorithms and Markov Chains with Rare
Transitions'', published in the Séminaire de Probabilités.
You can download here the draft of my Saint-Flour
lecture notes (July 2001) on statistical learning theory and stochastic
optimization. The final version of these notes is now published as
Springer Lecture Notes in Mathematics Number 1851.
Please consider buying the book, or encouraging your
library to buy it, if you liked the draft!
(As a courtesy to Springer's efforts to make the Saint-Flour
summer school notes widely available: authors don't get royalties
on lecture notes, which is why I feel free to give you this piece.)
Information theory, statistical learning and pattern recognition
A workshop on this theme was held at the CIRM in December 1998.
The program of this meeting is kept here.
The Gibbs estimator in action: downloadable software for density estimation
I wrote a program to illustrate a communication I presented
at Foundations of Computational Mathematics (July 13-17, 2000, Hong Kong).
You can have a look at its documentation here,
where you will also find download instructions.
The slides of my talk.
Empirical complexity and randomized estimators
You can download here the slides (as a .dvi or
.pdf file) of
my talk at the workshop
« Statistical Learning in Classification and Model Selection »
EURANDOM, Eindhoven, The Netherlands
January 15-18, 2003, organized by
Prof.dr. R.D.Gill (Universiteit Utrecht/EURANDOM), Dr. P. Grünwald (CWI), Prof.dr A.W. van der Vaart (Vrije Universiteit Amsterdam/EURANDOM),
Dr. J. Lember (EURANDOM)
as well as the corresponding preprint (as a .dvi or .pdf file),
also to be available soon on the PMA server.
DEA lectures: Classification and model selection (2003)
Lecture notes in postscript and
pdf formats are available.
Théorèmes PAC Bayésiens locaux et
Those who understand French can download the slides of this
talk from my French homepage.
Back to the department homepage.