Learning with optimal interpolation norms
DOI: 10.1007/S11075-018-0568-1
zbMATH Open: 1454.90049
arXiv: 1603.09273
OpenAlex: W2753852770
Wikidata: Q129562610 (Scholia: Q129562610)
MaRDI QID: Q2420165
FDO: Q2420165
Authors: Patrick L. Combettes, Andrew M. McDonald, Charles A. Micchelli, Massimiliano Pontil
Publication date: 5 June 2019
Published in: Numerical Algorithms
Full work available at URL: https://arxiv.org/abs/1603.09273
Keywords: machine learning; Douglas-Rachford splitting (a generic illustrative sketch follows the MSC codes below); block-coordinate proximal algorithm; infimal postcomposition; latent group lasso; optimal interpolation norm
MSC: Convex programming (90C25); Monotone operators and generalizations (47H05); Programming in abstract spaces (90C48)
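For readers unfamiliar with the operator-splitting technique named in the keywords, here is a minimal, generic NumPy sketch of Douglas-Rachford splitting, applied for illustration to a standard lasso problem. The matrix A, penalty lam, and step size gamma are hypothetical choices made for this sketch; this is not the stochastic block-coordinate algorithm developed in the paper itself.

```python
# Generic Douglas-Rachford splitting for min_x f(x) + g(x), illustrated on
# lasso: f(x) = 0.5*||Ax - b||^2, g(x) = lam*||x||_1. Purely illustrative;
# not the block-coordinate method of Combettes et al.
import numpy as np

def prox_quadratic(y, A, b, gamma):
    # prox_{gamma f}(y): solve (I + gamma*A^T A) x = y + gamma*A^T b
    n = A.shape[1]
    return np.linalg.solve(np.eye(n) + gamma * A.T @ A, y + gamma * A.T @ b)

def prox_l1(y, t):
    # prox_{t ||.||_1}(y): componentwise soft-thresholding
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

def douglas_rachford(A, b, lam, gamma=1.0, iters=500):
    y = np.zeros(A.shape[1])
    for _ in range(iters):
        x = prox_quadratic(y, A, b, gamma)      # x_n = prox_{gamma f}(y_n)
        z = prox_l1(2.0 * x - y, gamma * lam)   # prox_{gamma g} of reflection
        y = y + z - x                           # y_{n+1} = y_n + z_n - x_n
    return prox_quadratic(y, A, b, gamma)

# Small synthetic test: recover a 3-sparse vector from 20 measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[:3] = [1.0, -2.0, 3.0]
b = A @ x_true
print(douglas_rachford(A, b, lam=0.1)[:5])  # should be close to x_true[:5]
```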
Cites Work
- Sparsity and Smoothness Via the Fused Lasso
- Model Selection and Estimation in Regression with Grouped Variables
- Proximal methods for hierarchical sparse coding
- Tensor Decompositions and Applications
- The composite absolute penalties family for grouped and hierarchical variable selection
- Convex Analysis
- Title not available
- Optimization with sparsity-inducing penalties
- Proximity algorithms for image models: denoising
- Tensor completion and low-\(n\)-rank tensor recovery via convex optimization
- Title not available
- A new approach in interpolation spaces
- Learning the kernel function via regularization
- Systems of Structured Monotone Inclusions: Duality, Algorithms, and Applications
- Stochastic quasi-Fejér block-coordinate fixed point iterations with random sweeping
- The convex geometry of linear inverse problems
- Using Block Norms for Location Modeling
- Convex multi-task feature learning
- Learning with tensors: a framework based on convex optimization and spectral regularization
- Convex analysis and monotone operator theory in Hilbert spaces
- An algorithm for splitting parallel sums of linearly composed monotone operators, with applications to signal recovery
- Super-Resolution With Sparse Mixing Estimators
- Title not available
- Feature space perspectives for learning the kernel
- Robust 2D Principal Component Analysis: A Structured Sparsity Regularized Approach
- Fundamental Performance Limits for Ideal Decoders in High-Dimensional Linear Inverse Problems
- Background Subtraction Based on Low-Rank and Structured Sparse Decomposition
- On spectral learning
- Prolongements de foncteurs d'interpolation et applications
- New perspectives on \(k\)-support and cluster norms
- Exploring Structured Sparsity by a Reweighted Laplace Prior for Hyperspectral Compressive Sensing
- Stochastic quasi-Fejér block-coordinate fixed point iterations with random sweeping. II: Mean-square and linear convergence
- Regularizers for structured sparsity
- Structured sparsity and generalization
Cited in 1 document