H. Akaike, Information theory and an extension of the maximum likelihood principle, Proceedings of the 2nd International Symposium on Information Theory, pp.267-281, 1973.

L. Birgé and P. Massart, Gaussian model selection, Journal of the European Mathematical Society, vol.3, issue.3, pp.203-268, 2001.
DOI : 10.1007/s100970100031

L. Breiman, J. H. Friedman, R. A. Olshen, and C. J. Stone, Classification and regression trees, 1984.

B. Efron, T. Hastie, I. Johnstone, and R. Tibshirani, Least angle regression, Annals of Statistics, vol.32, pp.407-499, 2004.

E. Greenshtein and Y. Ritov, Persistence in high-dimensional linear predictor selection and the virtue of overparametrization, Bernoulli, vol.10, issue.6, pp.971-988, 2004.
DOI : 10.3150/bj/1106314846

S. Keerthi and S. Shevade, A Fast Tracking Algorithm for Generalized LARS/LASSO, IEEE Transactions on Neural Networks, vol.18, issue.6, 2007.
DOI : 10.1109/TNN.2007.900229

K. Knight and W. Fu, Asymptotics for LASSO-type estimators, Annals of Statistics, vol.28, issue.5, pp.1356-1378, 2000.

C. L. Mallows, Some comments on Cp, Technometrics, vol.15, pp.661-675, 1973.

M. Y. Park and T. Hastie, L1-regularization path algorithm for generalized linear models, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.69, issue.4, pp.659-677, 2007.

URL : https://hal.archives-ouvertes.fr/hal-00458708

S. Rosset and J. Zhu, Piecewise linear regularized solution paths, The Annals of Statistics, vol.35, issue.3, pp.1012-1030, 2007.
DOI : 10.1214/009053606000001370

G. Schwarz, Estimating the Dimension of a Model, The Annals of Statistics, vol.6, issue.2, pp.461-464, 1978.
DOI : 10.1214/aos/1176344136

R. Tibshirani, Regression Shrinkage and Selection via the Lasso, Journal of the Royal Statistical Society B, vol.58, issue.1, pp.267-288, 1996.

P. Zhao and B. Yu, On Model Selection Consistency of Lasso, Statistics Department, 2006.

H. Zou, T. Hastie, and R. Tibshirani, On the "degrees of freedom" of the lasso, The Annals of Statistics, vol.35, issue.5, pp.2173-2192, 2007.
DOI : 10.1214/009053607000000127

P. J. Bickel, Y. Ritov, and A. Tsybakov, Simultaneous analysis of Lasso and Dantzig selector, The Annals of Statistics, vol.37, issue.4, 2008.
DOI : 10.1214/08-AOS620

URL : https://hal.archives-ouvertes.fr/hal-00401585

S. Boyd and L. Vandenberghe, Convex Optimization, 2004.

F. Bunea, A. Tsybakov, and M. Wegkamp, Sparsity oracle inequalities for the Lasso, Electronic Journal of Statistics, vol.1, issue.0, pp.169-194, 2007.
DOI : 10.1214/07-EJS008

URL : https://hal.archives-ouvertes.fr/hal-00160646

B. Efron, T. Hastie, I. Johnstone, and R. Tibshirani, Least Angle Regression, Annals of Statistics, vol.32, pp.407-499, 2004.

J. Germain, A Two-steps Model Selection Procedure Based on the Regularization Path of a L1-Penalized Logistic Likelihood, Proceedings of SFdS, 2007.

E. Greenshtein and Y. Ritov, Persistence in high-dimensional linear predictor selection and the virtue of overparametrization, Bernoulli, vol.10, issue.6, pp.971-988, 2004.
DOI : 10.3150/bj/1106314846

S. J. Haberman, Concavity and Estimation, The Annals of Statistics, vol.17, issue.4, pp.1631-1661, 1989.
DOI : 10.1214/aos/1176347385

A. E. Hoerl and R. W. Kennard, Ridge Regression: Biased Estimation for Nonorthogonal Problems, Technometrics, vol.12, issue.1, pp.55-67, 1970.

P. J. Huber, The behavior of maximum likelihood estimates under nonstandard conditions, Proc. Fifth Berkeley Sympos, pp.221-233, 1965.

J. K. Kim and D. Pollard, Cube Root Asymptotics, The Annals of Statistics, vol.18, issue.1, pp.191-219, 1990.
DOI : 10.1214/aos/1176347498

K. Knight and W. Fu, Asymptotics for LASSO-Type Estimators, Annals of Statistics, vol.28, issue.5, pp.1356-1378, 2000.

W. Niemiro, Asymptotics for M-Estimators Defined by Convex Minimization, The Annals of Statistics, vol.20, issue.3, pp.1514-1533, 1992.
DOI : 10.1214/aos/1176348782

M. Y. Park and T. Hastie, L1-regularization path algorithm for generalized linear models, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.69, issue.4, pp.659-677, 2007.

URL : https://hal.archives-ouvertes.fr/hal-00458708

D. Pollard, New Ways to Prove Central Limit Theorems, Econometric Theory, vol.1, issue.3, pp.295-313, 1985.

R. T. Rockafellar, Convex analysis, Princeton Mathematical Series, 1970.

R. Tibshirani, Regression Shrinkage and Selection via the Lasso, J. Roy. Statist. Soc. Ser. B, vol.58, issue.1, pp.267-288, 1996.

A. W. van der Vaart, Asymptotic Statistics, 1998.

A. W. van der Vaart and J. A. Wellner, Weak Convergence and Empirical Processes: With Applications to Statistics, 1996.
DOI : 10.1007/978-1-4757-2545-2

P. Zhao and B. Yu, On model selection consistency of Lasso, J. Mach. Learn. Res, vol.7, pp.2541-2563, 2006.

H. Zou, T. Hastie, and R. Tibshirani, On the "degrees of freedom" of the lasso, The Annals of Statistics, vol.35, issue.5, pp.2173-2192, 2007.
DOI : 10.1214/009053607000000127

N. Ansaldi, Contributions des méthodes statistiques à la quantification de l'agrément de conduite, Thèse de Doctorat, 2002.

P. J. Bickel, Y. Ritov, and A. Tsybakov, Simultaneous analysis of Lasso and Dantzig selector, The Annals of Statistics, vol.37, issue.4, 2008.
DOI : 10.1214/08-AOS620

URL : https://hal.archives-ouvertes.fr/hal-00401585

L. Birgé and P. Massart, Gaussian model selection, Journal of the European Mathematical Society, vol.3, issue.3, pp.203-268, 2001.
DOI : 10.1007/s100970100031

L. Birgé and P. Massart, Minimal penalties for Gaussian model selection. Probab. Theory Related Fields, pp.33-73, 2007.

S. Boyd and L. Vandenberghe, Convex optimization, 2004.

L. Breiman, J. H. Friedman, R. A. Olshen, and C. J. Stone, Classification and regression trees, 1984.

F. Bunea, A. Tsybakov, and M. Wegkamp, Sparsity oracle inequalities for the Lasso, Electronic Journal of Statistics, vol.1, issue.0, pp.169-194, 2007.
DOI : 10.1214/07-EJS008

URL : https://hal.archives-ouvertes.fr/hal-00160646

F. Cailliez and J. Pagès, Introduction à l'analyse de données, SMASH, p.616, 1976.

E. J. Candès and T. Tao, Decoding by Linear Programming, IEEE Transactions on Information Theory, vol.51, issue.12, pp.4203-4215, 2005.
DOI : 10.1109/TIT.2005.858979

E. J. Candès and T. Tao, The Dantzig selector: Statistical estimation when p is much larger than n, The Annals of Statistics, vol.35, issue.6, pp.2313-2351, 2007.
DOI : 10.1214/009053606000001523

A. P. Dempster, N. M. Laird, and D. B. Rubin, Maximum likelihood from incomplete data via the EM algorithm, J. Roy. Statist. Soc. Ser. B, vol.39, issue.1, pp.1-38, 1977.

D. L. Donoho, M. Elad, and V. N. Temlyakov, Stable recovery of sparse overcomplete representations in the presence of noise, IEEE Transactions on Information Theory, vol.52, issue.1, pp.6-18, 2006.
DOI : 10.1109/TIT.2005.860430

D. L. Donoho and I. M. Johnstone, Ideal spatial adaptation by wavelet shrinkage, Biometrika, vol.81, issue.3, pp.425-455, 1994.

I. E. Frank and J. H. Friedman, A Statistical View of Some Chemometrics Regression Tools, Technometrics, vol.35, issue.2, pp.109-148, 1993.
DOI : 10.1080/00401706.1993.10485033

N. Freed and F. Glover, A linear programming approach to the discriminant problem, Decision Sciences, vol.12, issue.1, pp.68-74, 1981.

P. Geladi and B. Kowalski, Partial least-squares regression: a tutorial, Analytica Chimica Acta, vol.185, pp.1-17, 1986.
DOI : 10.1016/0003-2670(86)80028-9

J. Germain, A Two-steps Model Selection Procedure Based on the Regularization Path of a L1-Penalized Logistic Likelihood, Proceedings of SFdS, 2007.

C. J. Geyer, On the Asymptotics of Convex Stochastic Estimation, 1996.

E. Greenshtein and Y. Ritov, Persistence in high-dimensional linear predictor selection and the virtue of overparametrization, Bernoulli, vol.10, issue.6, pp.971-988, 2004.
DOI : 10.3150/bj/1106314846

S. J. Haberman, Concavity and Estimation, The Annals of Statistics, vol.17, issue.4, pp.1631-1661, 1989.
DOI : 10.1214/aos/1176347385

URL : http://projecteuclid.org/download/pdf_1/euclid.aos/1176347385

T. Hastie, R. Tibshirani, and J. Friedman, The elements of statistical learning, Data mining, inference, and prediction, 2001.

N. L. Hjort and D. Pollard, Asymptotics for Minimisers of Convex Processes, 1993.

A. E. Hoerl and R. W. Kennard, Ridge Regression: Biased Estimation for Nonorthogonal Problems, Technometrics, vol.12, issue.1, pp.55-67, 1970.

P. J. Huber, The behavior of maximum likelihood estimates under nonstandard conditions, Proc. Fifth Berkeley Sympos, pp.221-233, 1965.

E. A. Joachimsthaler and A. Stam, Mathematical Programming Approaches for the Classification Problem in Two-Group Discriminant Analysis, Multivariate Behavioral Research, vol.25, issue.4, 1990.
DOI : 10.1207/s15327906mbr2504_2

S. Keerthi and S. Shevade, A Fast Tracking Algorithm for Generalized LARS/LASSO, IEEE Transactions on Neural Networks, vol.18, issue.6, 2007.

J. K. Kim and D. Pollard, Cube Root Asymptotics, The Annals of Statistics, vol.18, issue.1, pp.191-219, 1990.
DOI : 10.1214/aos/1176347498

K. Knight and W. Fu, Asymptotics for lasso-type estimators, Ann. Statist, vol.28, issue.5, pp.1356-1378, 2000.

C. Leng, Y. Lin, and G. Wahba, A note on the lasso and related procedures in model selection, Statist. Sinica, vol.16, issue.4, pp.1273-1284, 2006.

R. D. Luce, Individual choice behavior: A theoretical analysis, 1959.
DOI : 10.1037/14396-000

C. L. Mallows, Some Comments on Cp, Technometrics, vol.15, pp.661-675, 1973.

W. Niemiro, Asymptotics for M-Estimators Defined by Convex Minimization, The Annals of Statistics, vol.20, issue.3, pp.1514-1533, 1992.
DOI : 10.1214/aos/1176348782

M. R. Osborne, B. Presnell, and B. A. Turlach, Knot Selection for Regression Splines via the Lasso, Computing Science and Statistics, vol.30, pp.44-49, 1998.

M. R. Osborne, B. Presnell, and B. A. Turlach, On the LASSO and its dual, J. Comput. Graph. Statist, vol.9, issue.2, pp.319-337, 2000.

M. Y. Park and T. Hastie, L1-regularization path algorithm for generalized linear models, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.69, issue.4, pp.659-677, 2007.

URL : https://hal.archives-ouvertes.fr/hal-00458708

J. Poggi and C. Tuleau, Classification of objectivization data using CART and wavelets, Proceedings, 2007.

D. Pollard, New Ways to Prove Central Limit Theorems, Econometric Theory, vol.1, issue.03, pp.295-313, 1985.
DOI : 10.1214/aoms/1177703732

R. T. Rockafellar, Convex analysis. Princeton Mathematical Series, 1970.

S. Rosset and J. Zhu, Piecewise linear regularized solution paths, The Annals of Statistics, vol.35, issue.3, pp.1012-1030, 2007.
DOI : 10.1214/009053606000001370

URL : http://arxiv.org/abs/0708.2197

M. Sauvé, Sélection de modèles en régression non gaussienne. Applications à la sélection de variables et aux tests de survie accélérés, Thèse de Doctorat, 2006.

G. Schwarz, Estimating the Dimension of a Model, The Annals of Statistics, vol.6, issue.2, pp.461-464, 1978.
DOI : 10.1214/aos/1176344136

J. A. Swets and R. M. Pickett, Evaluation of Diagnostic Systems: Methods from Signal Detection Theory, 1982.

M. Tenenhaus, A PLS approach to multiple table analysis, in Classification, Clustering, and Data Mining Applications, Stud. Classification Data Anal. Knowledge Organ., pp.607-620, 2004.

M. Tenenhaus, La régression logistique PLS, Modèles statistiques pour données qualitatives, pp.263-276, 2005.

R. Tibshirani, Regression shrinkage and selection via the lasso, J. Roy. Statist. Soc. Ser. B, vol.58, issue.1, pp.267-288, 1996.

K. E. Train, Discrete choice methods with simulation, 2003.
DOI : 10.1017/cbo9780511805271

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.455.8654

A. W. van der Vaart, Asymptotic Statistics, 1998.

A. W. van der Vaart and J. A. Wellner, Weak Convergence and Empirical Processes, Springer Series in Statistics, 1996.

V. Vapnik, Estimation of Dependences Based on Empirical Data, Springer Series in Statistics, 1982.

H. Wold, Estimation of principal components and related models by iterative least squares, Multivariate Analysis (Proc. Internat. Sympos.), pp.391-420, 1965.

P. Zhao and B. Yu, On model selection consistency of Lasso, J. Mach. Learn. Res, vol.7, pp.2541-2563, 2006.

H. Zou, T. Hastie, and R. Tibshirani, On the "degrees of freedom" of the lasso, The Annals of Statistics, vol.35, issue.5, pp.2173-2192, 2007.
DOI : 10.1214/009053607000000127