Regularization and the small-ball method. I: Sparse recovery (Q1750281)

Property / cites work: Asymptotic Geometric Analysis, Part I / rank: Normal rank
Property / cites work: Simultaneous analysis of Lasso and Dantzig selector / rank: Normal rank
Property / cites work: SLOPE-adaptive variable selection via convex optimization / rank: Normal rank
Property / cites work: Statistics for high-dimensional data. Methods, theory and applications. / rank: Normal rank
Property / cites work: Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements / rank: Normal rank
Property / cites work: Recovering Low-Rank Matrices From Few Coefficients in Any Basis / rank: Normal rank
Property / cites work: Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008. / rank: Normal rank
Property / cites work: Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion / rank: Normal rank
Property / cites work: Bounding the Smallest Singular Value of a Random Matrix Without Concentration / rank: Normal rank
Property / cites work: Regularization and the small-ball method II: complexity dependent error rates / rank: Normal rank
Property / cites work: Sparse recovery under weak moment assumptions / rank: Normal rank
Property / cites work: Regularization and the small-ball method. I: Sparse recovery / rank: Normal rank
Property / cites work: Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators / rank: Normal rank
Property / cites work: High-dimensional graphs and variable selection with the Lasso / rank: Normal rank
Property / cites work: Lasso-type recovery of sparse representations for high-dimensional data / rank: Normal rank
Property / cites work: Learning without concentration for general loss functions / rank: Normal rank
Property / cites work: Learning without Concentration / rank: Normal rank
Property / cites work: A Remark on the Diameter of Random Sections of Convex Bodies / rank: Normal rank
Property / cites work: Upper bounds on product and multiplier empirical processes / rank: Normal rank
Property / cites work: Restricted strong convexity and weighted matrix completion: Optimal bounds with noise / rank: Normal rank
Property / cites work: A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers / rank: Normal rank
Property / cites work: Confidence sets in sparse regression / rank: Normal rank
Property / cites work: Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization / rank: Normal rank
Property / cites work: Estimation of high-dimensional low-rank matrices / rank: Normal rank
Property / cites work: Small Ball Probabilities for Linear Images of High-Dimensional Distributions / rank: Normal rank
Property / cites work: SLOPE is adaptive to unknown sparsity and asymptotically minimax / rank: Normal rank
Property / cites work: Upper and Lower Bounds for Stochastic Processes / rank: Normal rank
Property / cites work: Q4864293 / rank: Normal rank
Property / cites work: Weakly decomposable regularization penalties and structured sparsity / rank: Normal rank
Property / cites work: On asymptotically optimal confidence regions and tests for high-dimensional models / rank: Normal rank
Property / cites work: High-dimensional generalized linear models and the lasso / rank: Normal rank
Property / cites work: Characterization of the subdifferential of some matrix norms / rank: Normal rank
Property / cites work: Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models / rank: Normal rank
Property / cites work: Q3174050 / rank: Normal rank


Language: English
Label: Regularization and the small-ball method. I: Sparse recovery
Description: scientific article

    Statements

    Regularization and the small-ball method. I: Sparse recovery (English)
    18 May 2018
    Let \((\Omega, \mu)\) be a probability space and let \(X\) be distributed according to \(\mu\). Here \(F\) is a class of real-valued functions defined on \(\Omega\), \(Y\) is the unknown random variable that one would like to approximate using functions in \(F\), and \(\lambda\) is the regularization parameter. The authors study the best approximation to \(Y\) in \(F\), namely the function \(f^*\) that minimizes over \(F\) the squared loss functional \(f \mapsto E(f(X)-Y)^2\).
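    In practice one works with the empirical version of this functional plus \(\lambda\) times a norm penalty. As a purely illustrative sketch (not the authors' procedure), the Python snippet below solves the linear, \(\ell_1\)-penalized special case (the Lasso) by proximal gradient descent; the function name lasso_ista, the step-size rule, and all parameter values are assumptions made for this example.

```python
import numpy as np

def soft_threshold(v, t):
    """Entrywise soft-thresholding: the proximal map of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    """Minimize (1/N) * ||X w - y||_2^2 + lam * ||w||_1 by proximal gradient (ISTA)."""
    N, d = X.shape
    w = np.zeros(d)
    # Step size 1/L, where L is a Lipschitz constant of the smooth part's gradient.
    L = 2.0 * np.linalg.norm(X, ord=2) ** 2 / N
    step = 1.0 / L
    for _ in range(n_iter):
        grad = 2.0 * X.T @ (X @ w - y) / N   # gradient of the empirical squared loss
        w = soft_threshold(w - step * grad, step * lam)
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N, d, s = 200, 500, 5                    # fewer samples than features, sparse target
    w_star = np.zeros(d)
    w_star[:s] = 1.0
    X = rng.standard_normal((N, d))
    y = X @ w_star + 0.1 * rng.standard_normal(N)
    w_hat = lasso_ista(X, y, lam=0.1)
    print("recovered support:", np.nonzero(np.abs(w_hat) > 1e-3)[0])
```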
    empirical processes
    high dimensional statistics

    Identifiers