D. Amelunxen, M. Lotz, M. B. McCoy, and J. A. Tropp, Living on the edge: phase transitions in convex programs with random data, Information and Inference, vol.3, issue.3, pp.224-294, 2014.
DOI : 10.1093/imaiai/iau005

S. Arlot and F. Bach, Data-driven calibration of linear estimators with minimal penalties, Advances in Neural Information Processing Systems, pp.46-54, 2009.
URL : https://hal.archives-ouvertes.fr/hal-00414774

J. Audibert, No fast exponential deviation inequalities for the progressive mixture rule. arXiv preprint math/0703848, 2007.
URL : https://hal.archives-ouvertes.fr/hal-00139018

J. Audibert, Fast learning rates in statistical inference through aggregation, The Annals of Statistics, vol.37, issue.4, pp.1591-1646, 2009.
DOI : 10.1214/08-AOS623

URL : https://hal.archives-ouvertes.fr/hal-00139030

F. Balabdaoui and J. A. Wellner, Estimation of a k-monotone density: limit distribution theory and the spline connection, The Annals of Statistics, vol.35, issue.6, pp.2536-2564, 2007.
DOI : 10.1214/009053607000000262

URL : https://hal.archives-ouvertes.fr/hal-00363240

F. Balabdaoui and J. A. Wellner, Estimation of a k-monotone density: characterizations, consistency and minimax lower bounds, Statistica Neerlandica, vol.64, issue.1, pp.45-70, 2010.
DOI : 10.1111/j.1467-9574.2009.00438.x

URL : https://hal.archives-ouvertes.fr/hal-00701836

M. Banerjee and J. A. Wellner, Likelihood ratio tests for monotone functions, Ann. Statist, vol.29, issue.6, pp.1699-1731, 2001.

Y. Baraud, C. Giraud, and S. Huet, Estimator selection in the Gaussian setting, Annales de l'Institut Henri Poincaré, Probabilités et Statistiques, vol.50, issue.3, pp.1092-1119, 2014.
DOI : 10.1214/13-AIHP539

URL : https://hal.archives-ouvertes.fr/hal-00502156

P. L. Bartlett and S. Mendelson, Empirical minimization, Probab. Theory Related Fields, vol.135, issue.3-4, pp.311-334, 2006.

P. C. Bellec, Optimal bounds for aggregation of affine estimators, 2014.

P. C. Bellec, Optimal exponential bounds for aggregation of density estimators, 2014.

P. C. Bellec, Sharp oracle inequalities for least squares estimators in shape restricted regression, 2015.

P. C. Bellec and A. B. Tsybakov, Sharp oracle bounds for monotone and convex regression through aggregation, J. Mach. Learn. Res., vol.16, pp.1879-1892, 2015.

A. Belloni and V. Chernozhukov, Least squares after model selection in high-dimensional sparse models, Bernoulli, vol.19, issue.2, pp.521-547, 2013.
DOI : 10.3150/11-BEJ410SUPP

A. Belloni, V. Chernozhukov, and L. Wang, Pivotal estimation via square-root Lasso in nonparametric regression, The Annals of Statistics, vol.42, issue.2, pp.757-788, 2014.
DOI : 10.1214/14-AOS1204SUPP

P. J. Bickel, Y. Ritov, and A. B. Tsybakov, Simultaneous analysis of Lasso and Dantzig selector, Ann. Statist., vol.37, issue.4, pp.1705-1732, 2009.

L. Birgé and P. Massart, Gaussian model selection, Journal of the European Mathematical Society, vol.3, issue.3, pp.203-268, 2001.
DOI : 10.1007/s100970100031

L. Birgé and P. Massart, Minimal Penalties for Gaussian Model Selection, Probability Theory and Related Fields, vol.6, issue.1-2, pp.33-73, 2007.
DOI : 10.1007/s00440-006-0011-8

S. Boucheron, G. Lugosi, and P. Massart, Concentration Inequalities, 2013.
DOI : 10.1007/978-1-4757-2440-0

URL : https://hal.archives-ouvertes.fr/hal-00751496

S. Boyd and L. Vandenberghe, Convex optimization, 2009.

T. T. Cai and M. G. Low, Adaptive confidence balls, Ann. Statist., vol.34, issue.1, pp.202-228, 2006.

T. T. Cai, M. G. Low, and Y. Xia, Adaptive confidence intervals for regression functions under shape constraints, Ann. Statist., vol.41, issue.2, pp.722-750, 2013.

E. Candes and T. Tao, The Dantzig selector: Statistical estimation when p is much larger than n, The Annals of Statistics, vol.35, issue.6, pp.2313-2351, 2007.
DOI : 10.1214/009053606000001523

O. Catoni, Statistical learning theory and stochastic optimization, Lectures from the 31st Summer School on Probability Theory held in Saint-Flour, 2001, Lecture Notes in Mathematics, vol.1851, Springer, 2004.
DOI : 10.1007/978-3-540-44507-4_1

V. Chandrasekaran, B. Recht, P. A. Parrilo, and A. S. Willsky, The convex geometry of linear inverse problems, Foundations of Computational Mathematics, vol.12, issue.6, pp.805-849, 2012.
DOI : 10.1007/s10208-012-9135-7

S. Chatterjee, A. Guntuboyina, and B. Sen, On risk bounds in isotonic and other shape restricted regression problems, The Annals of Statistics, vol.43, issue.4, pp.1774-1800, 2015.
DOI : 10.1214/15-AOS1324SUPP

S. Chatterjee, A. Guntuboyina, and B. Sen, On matrix estimation under monotonicity constraints. arXiv preprint, 2015.

S. Chatterjee, A new perspective on least squares under convex constraint, The Annals of Statistics, vol.42, issue.6, pp.2340-2381, 2014.
DOI : 10.1214/14-AOS1254

X. Chen, Q. Lin, and B. Sen, On degrees of freedom of projection estimators with applications to multivariate shape restricted regression, 2015.

A. Cohen, All Admissible Linear Estimates of the Mean Vector, The Annals of Mathematical Statistics, vol.37, issue.2, pp.458-463, 1966.
DOI : 10.1214/aoms/1177699528

D. Dai, P. Rigollet, and T. Zhang, Deviation optimal learning using greedy Q-aggregation, The Annals of Statistics, vol.40, issue.3, pp.1878-1905, 2012.
DOI : 10.1214/12-AOS1025

D. Dai, P. Rigollet, L. Xia, and T. Zhang, Aggregation of affine estimators, Electron. J. Statist., vol.8, issue.1, pp.302-327, 2014.

A. S. Dalalyan and A. B. Tsybakov, Sparse regression learning by aggregation and Langevin Monte-Carlo, Journal of Computer and System Sciences, vol.78, issue.5, pp.1423-1443, 2012.
DOI : 10.1016/j.jcss.2011.12.023

URL : https://hal.archives-ouvertes.fr/hal-00362471

A. S. Dalalyan and J. Salmon, Sharp oracle inequalities for aggregation of affine estimators, Ann. Statist., vol.40, issue.4, pp.2327-2355, 2012.

A. S. Dalalyan and A. B. Tsybakov, Aggregation by exponential weighting and sharp oracle inequalities, Learning theory, pp.97-111, 2007.

A. S. Dalalyan and A. B. Tsybakov, Mirror averaging with sparsity priors, Bernoulli, vol.18, issue.3, pp.914-944, 2012.

A. S. Dalalyan, M. Hebiri, and J. Lederer, On the prediction performance of the Lasso, arXiv preprint, 2014.

H. Dette, A. Munk, and T. Wagner, Estimating the variance in nonparametric regression-what is a reasonable choice?, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.60, issue.4, pp.751-764, 1998.
DOI : 10.1111/1467-9868.00152

D. L. Donoho, R. C. Liu, and B. MacGibbon, Minimax risk over hyperrectangles, and implications, Ann. Statist., vol.18, issue.3, pp.1416-1437, 1990.

D. L. Donoho, I. M. Johnstone, J. C. Hoch, and A. S. Stern, Maximum entropy and the nearly black object (with discussion and a reply by the authors), J. Roy. Statist. Soc. Ser. B, vol.54, issue.1, pp.41-81, 1992.

L. Dümbgen, Optimal confidence bands for shape-restricted curves, Bernoulli, vol.9, issue.3, pp.423-449, 2003.
DOI : 10.3150/bj/1065444812

S. Yu. Efroimovich and M. S. Pinsker, A self-training algorithm for nonparametric filtering, Avtomat. i Telemekh., issue.11, pp.58-65, 1984.

B. Efron, T. Hastie, I. Johnstone, and R. Tibshirani, Least angle regression (with discussion and a rejoinder by the authors), Ann. Statist., vol.32, issue.2, pp.407-499, 2004.

F. Gao and J. A. Wellner, Entropy estimate for high-dimensional monotonic functions, Journal of Multivariate Analysis, vol.98, issue.9, pp.1751-1764, 2007.
DOI : 10.1016/j.jmva.2006.09.003

S. Gerchinovitz, Prediction of individual sequences and prediction in the statistical framework: some links around sparse regression and aggregation techniques, PhD thesis, Université Paris-Sud, 2011.
URL : https://hal.archives-ouvertes.fr/tel-00653550

C. Giraud, Mixing least-squares estimators when the variance is unknown, Bernoulli, vol.14, issue.4, pp.1089-1107, 2008.
DOI : 10.3150/08-BEJ135

URL : https://hal.archives-ouvertes.fr/hal-00184869

C. Giraud, Introduction to high-dimensional statistics, volume 139 of Monographs on Statistics and Applied Probability, 2015.

C. Giraud, S. Huet, and N. Verzelen, High-dimensional regression with unknown variance, Statistical Science, vol.27, issue.4, pp.500-518, 2012.
DOI : 10.1214/12-STS398SUPP

URL : https://hal.archives-ouvertes.fr/hal-00626630

A. Guntuboyina and B. Sen, Global risk bounds and adaptation in univariate convex regression, Probability Theory and Related Fields, vol.30, issue.2, pp.379-411, 2015.
DOI : 10.1007/s00440-014-0595-3

P. Hall, J. W. Kay, and D. M. Titterington, Asymptotically optimal difference-based estimation of variance in nonparametric regression, Biometrika, vol.77, issue.3, pp.521-528, 1990.
DOI : 10.1093/biomet/77.3.521

D. L. Hanson and F. T. Wright, A Bound on Tail Probabilities for Quadratic Forms in Independent Random Variables, The Annals of Mathematical Statistics, vol.42, issue.3, pp.1079-1083, 1971.
DOI : 10.1214/aoms/1177693335

M. Hebiri and J. Lederer, How Correlations Influence Lasso Prediction, IEEE Transactions on Information Theory, vol.59, issue.3, pp.1846-1854, 2013.
DOI : 10.1109/TIT.2012.2227680

URL : https://hal.archives-ouvertes.fr/hal-00686055

C. Heil, A Basis Theory Primer: Expanded Edition, 2011.
DOI : 10.1007/978-0-8176-4687-5

D. Hsu, S. M. Kakade, and T. Zhang, A tail inequality for quadratic forms of subgaussian random vectors, Electronic Communications in Probability, vol.17, 2012.
DOI : 10.1214/ECP.v17-2079

J. Immerkaer, Fast Noise Variance Estimation, Computer Vision and Image Understanding, vol.64, issue.2, pp.300-302, 1996.
DOI : 10.1006/cviu.1996.0060

I. M. Johnstone, Function estimation and Gaussian sequence models, Unpublished manuscript, 2002.

A. Juditsky, P. Rigollet, and A. B. Tsybakov, Learning by mirror averaging, The Annals of Statistics, vol.36, issue.5, pp.2183-2206, 2008.
DOI : 10.1214/07-AOS546

URL : https://hal.archives-ouvertes.fr/hal-00014097

G. Kerkyacharian, A. B. Tsybakov, V. Temlyakov, D. Picard, and V. Koltchinskii, Optimal Exponential Bounds on the Accuracy of Classification, Constructive Approximation, vol.45, issue.3, pp.421-444, 2014.
DOI : 10.1007/s00365-014-9229-3

URL : https://hal.archives-ouvertes.fr/hal-01019308

V. Koltchinskii, Local Rademacher complexities and oracle inequalities in risk minimization, The Annals of Statistics, vol.34, issue.6, pp.2593-2656, 2006.
DOI : 10.1214/009053606000001019

V. Koltchinskii, K. Lounici, and A. B. Tsybakov, Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion, Ann. Statist., vol.39, issue.5, pp.2302-2329, 2011.
URL : https://hal.archives-ouvertes.fr/hal-00676868

R. Latała, Tail and moment estimates for some types of chaos, Studia Math., vol.135, issue.1, pp.39-53, 1999.

B. Laurent and P. Massart, Adaptive estimation of a quadratic functional by model selection, Ann. Statist., vol.28, issue.5, pp.1302-1338, 2000.

G. Lecué, Lower bounds and aggregation in density estimation, J. Mach. Learn. Res, vol.7, pp.971-981, 2006.

G. Lecué, Empirical risk minimization is optimal for the convex aggregation problem, Bernoulli, vol.19, issue.5B, pp.2153-2166, 2013.
DOI : 10.3150/12-BEJ447

G. Lecué and S. Mendelson, Aggregation via empirical risk minimization, Probab. Theory Related Fields, vol.145, issue.3-4, pp.591-613, 2009.

G. Lecué and S. Mendelson, Sharper lower bounds on the performance of the empirical risk minimization algorithm, Bernoulli, vol.16, issue.3, pp.605-613, 2010.
DOI : 10.3150/09-BEJ225

G. Lecué and S. Mendelson, On the optimality of the aggregate with exponential weights for low temperatures, Bernoulli, vol.19, issue.2, pp.646-675, 2013.
DOI : 10.3150/11-BEJ408

G. Lecué and P. Rigollet, Optimal learning with Q-aggregation, The Annals of Statistics, vol.42, issue.1, pp.211-224, 2014.
DOI : 10.1214/13-AOS1190

G. Leung and A. R. Barron, Information Theory and Mixing Least-Squares Regressions, IEEE Transactions on Information Theory, vol.52, issue.8, pp.3396-3410, 2006.
DOI : 10.1109/TIT.2006.878172

K. Lounici, Generalized mirror averaging and D-convex aggregation, Mathematical Methods of Statistics, vol.16, issue.3, pp.246-259, 2007.
DOI : 10.3103/S1066530707030040

URL : https://hal.archives-ouvertes.fr/hal-00204674

J. Mairal and B. Yu, Complexity analysis of the lasso regularization path. arXiv preprint, 2012.

E. Mammen and S. van de Geer, Locally adaptive regression splines, The Annals of Statistics, vol.25, issue.1, pp.387-413, 1997.
DOI : 10.1214/aos/1034276635

P. Massart, Concentration inequalities and model selection, Lectures from the 33rd Summer School on Probability Theory held in Saint-Flour, 2003, Lecture Notes in Mathematics, vol.1896, Springer, 2007.

J. Matousek and J. Vondrak, The probabilistic method, lecture notes, 2008.

M. B. McCoy and J. A. Tropp, From Steiner formulas for cones to concentration of intrinsic volumes, Discrete Comput. Geom., vol.51, issue.4, pp.926-963, 2014.

N. Meinshausen and B. Yu, Lasso-type recovery of sparse representations for high-dimensional data, The Annals of Statistics, vol.37, issue.1, pp.246-270, 2009.
DOI : 10.1214/07-AOS582

S. Mendelson, On aggregation for heavy-tailed classes. arXiv preprint, 2015.

M. Meyer and M. Woodroofe, On the degrees of freedom in shape-restricted regression, Ann. Statist., vol.28, issue.4, pp.1083-1104, 2000.

A. Munk, N. Bissantz, T. Wagner, and G. Freitag, On difference-based variance estimation in nonparametric regression when the covariate is high dimensional, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.67, issue.1, pp.19-41, 2005.
DOI : 10.1111/1467-9469.00303

A. S. Nemirovskii, B. T. Polyak, and A. B. Tsybakov, Rate of convergence of nonparametric estimators of maximum-likelihood type, Problems of Information Transmission, pp.258-272, 1985.

A. Nemirovski, Topics in non-parametric statistics, in Lectures on Probability Theory and Statistics (Saint-Flour, 1998), Lecture Notes in Mathematics, vol.1738, pp.85-277, Springer, 2000.

S. Oymak and B. Hassibi, Sharp MSE bounds for proximal denoising, Foundations of Computational Mathematics, 2015.
DOI : 10.1007/s10208-015-9278-4

V. Pham, L. El Ghaoui, and A. Fernandez, Robust sketching for multiple square-root Lasso problems, arXiv preprint, 2014.

Y. Plan, R. Vershynin, and E. Yudovina, High-dimensional estimation with geometric constraints. arXiv preprint, 2014.

A. Rakhlin, K. Sridharan, and A. B. Tsybakov, Empirical entropy, minimax regret and minimax risk, Bernoulli, vol.23, issue.2, 2013.
DOI : 10.3150/14-BEJ679

Ph. Rigollet and A. B. Tsybakov, Linear and convex aggregation of density estimators, Math. Methods Statist., vol.16, issue.3, pp.260-280, 2007.

P. Rigollet, Kullback-Leibler aggregation and misspecified generalized linear models, The Annals of Statistics, vol.40, issue.2, pp.639-665, 2012.
DOI : 10.1214/11-AOS961SUPP

P. Rigollet and A. Tsybakov, Exponential Screening and optimal rates of sparse estimation, The Annals of Statistics, vol.39, issue.2, pp.731-771, 2011.
DOI : 10.1214/10-AOS854

URL : https://hal.archives-ouvertes.fr/hal-00606059

P. Rigollet and A. B. Tsybakov, Sparse estimation by exponential weighting, Statistical Science, vol.27, issue.4, pp.558-575, 2012.
DOI : 10.1214/12-STS393

M. Rudelson and R. Vershynin, Hanson-Wright inequality and sub-gaussian concentration, Electronic Communications in Probability, vol.18, 2013.
DOI : 10.1214/ECP.v18-2865

URL : http://arxiv.org/abs/1306.2872

T. Sun and C. Zhang, Scaled sparse linear regression, Biometrika, vol.99, issue.4, pp.879-898, 2012.
DOI : 10.1093/biomet/ass043

URL : http://arxiv.org/abs/1104.4595

A. B. Tsybakov, Aggregation and minimax optimality in high-dimensional estimation, Proceedings of the International Congress of Mathematicians, pp.225-246, 2014.

A. B. Tsybakov, Optimal rates of aggregation, in Learning Theory and Kernel Machines, Lecture Notes in Computer Science, vol.2777, Springer, pp.303-313, 2003.

A. B. Tsybakov, Introduction to nonparametric estimation, Springer Series in Statistics, Springer, 2009.

S. van de Geer and J. Lederer, The Lasso, correlated design, and improved oracle inequalities, in From Probability to Statistics and Back: High-Dimensional Models and Processes, volume 9 of Inst. Math. Stat. (IMS) Collect., pp.303-316, Inst. Math. Statist., 2013.

V. N. Vapnik and A. Ya. Chervonenkis, Teoriya raspoznavaniya obrazov. Statisticheskie problemy obucheniya, Izdat. Nauka, 1974.

R. Vershynin, Introduction to the non-asymptotic analysis of random matrices. arXiv preprint, 2010.

R. Vershynin, Estimation in high dimensions: a geometric perspective. arXiv preprint, 2014.

M. H. Wegkamp, Quasi-universal bandwidth selection for kernel density estimators, Canad. J. Statist., vol.27, issue.2, pp.409-420, 1999.

F. T. Wright, A Bound on Tail Probabilities for Quadratic Forms in Independent Random Variables Whose Distributions are not Necessarily Symmetric, The Annals of Probability, vol.1, issue.6, pp.1068-1070, 1973.
DOI : 10.1214/aop/1176996815

Y. Yang, Mixing strategies for density estimation, The Annals of Statistics, vol.28, issue.1, pp.75-87, 2000.
DOI : 10.1214/aos/1016120365

C. Zhang, Risk bounds in isotonic regression, The Annals of Statistics, vol.30, issue.2, pp.528-555, 2002.
DOI : 10.1214/aos/1021379864

Finally, in Chapter 7 we construct confidence sets in the context of shape-restricted regression: Chapter 7 proves the existence of confidence sets that capture the true function with high probability and whose diameter is of the order of the minimax rate (cf. 2009).