Optimization on the hierarchical Tucker manifold - applications to tensor completion
DOI: 10.1016/j.laa.2015.04.015 · zbMATH Open: 1317.65136 · arXiv: 1405.2096 · OpenAlex: W301566927 · MaRDI QID: Q2350002
Authors: Curt Da Silva, F. J. Herrmann
Publication date: 18 June 2015
Published in: Linear Algebra and its Applications
Full work available at URL: https://arxiv.org/abs/1405.2096
Recommendations
- Hypercomplex Tensor Completion via Convex Optimization
- Riemannian optimization for high-dimensional tensor completion
- Tensor completion in hierarchical tensor representations
- On the optimization landscape of tensor decompositions
- Tensor completion and low-\(n\)-rank tensor recovery via convex optimization
- Low-rank tensor completion by Riemannian optimization
- Nonconvex optimization for robust tensor completion from grossly sparse observations
- Nonlinearly preconditioned optimization on Grassmann manifolds for computing approximate Tucker tensor decompositions
- Tensor methods for nonlinear matrix completion
- Tensor principal component analysis via convex optimization
Keywords: conjugate gradient; algorithm; differential geometry; steepest descent; Gauss-Newton; tensor completion; low-rank tensor; hierarchical Tucker tensors; recursive subspace factorizations; Riemannian manifold optimization
MSC classifications:
- Numerical mathematical programming methods (65K05)
- Complexity and performance of numerical algorithms (65Y20)
- Multilinear algebra, tensor calculus (15A69)
- Methods of local Riemannian geometry (53B21)
- Programming in abstract spaces (90C48)
Cites Work
- Analysis of individual differences in multidimensional scaling via an \(n\)-way generalization of "Eckart-Young" decomposition
- A Singular Value Thresholding Algorithm for Matrix Completion
- Tensor Decompositions and Applications
- Exact matrix completion via convex optimization
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Tensor-train decomposition
- Hierarchical Singular Value Decomposition of Tensors
- Breaking the Curse of Dimensionality, Or How to Use SVD in Many Dimensions
- A Multilinear Singular Value Decomposition
- Tensor Rank and the Ill-Posedness of the Best Low-Rank Approximation Problem
- A block coordinate descent method for regularized multiconvex optimization with applications to nonnegative tensor factorization and completion
- Tensor completion and low-\(n\)-rank tensor recovery via convex optimization
- Low-rank tensor completion by Riemannian optimization
- The natural pseudo-distance as a quotient pseudo-metric, and applications
- Tensor spaces and numerical tensor calculus
- Local convergence of the alternating least squares algorithm for canonical tensor approximation
- A literature survey of low-rank tensor approximation techniques
- Tensor completion in hierarchical tensor representations
- From quantum to classical molecular dynamics: Reduced models and numerical analysis.
- A new scheme for the tensor representation
- The alternating linear scheme for tensor optimization in the tensor train format
- A Newton-Grassmann method for computing the best multilinear rank-\((r_1,r_2,r_3)\) approximation of a tensor
- Solution of Linear Systems and Matrix Inversion in the TT-Format
- Algorithm 941: \texttt{htucker} -- a Matlab toolbox for tensors in hierarchical Tucker format
- New inexact line search method for unconstrained optimization
- Simultaneously Structured Models With Application to Sparse and Low-Rank Matrices
- On manifolds of tensors of fixed TT-rank
- Dynamical approximation by hierarchical Tucker and tensor-train tensors
- The geometry of algorithms using hierarchical tensors
- Convergence of line search methods for unconstrained optimization
- Black box approximation of tensors in hierarchical Tucker format
- Low-rank optimization with trace norm penalty
- Approximation rates for the hierarchical tensor format in periodic Sobolev spaces
- Tree adaptive approximation in the hierarchical tensor format
- Convergence results for projected line-search methods on varieties of low-rank matrices via Łojasiewicz inequality
- Conjugate Gradient Iterative Hard Thresholding: Observed Noise Stability for Compressed Sensing
Cited In (27)
- Dynamically orthogonal tensor methods for high-dimensional nonlinear PDEs
- Enabling numerically exact local solver for waveform inversion -- a low-rank approach
- Analysis of asymptotic escape of strict saddle sets in manifold optimization
- The numerical approximation of nonlinear functionals and functional differential equations
- Stable ALS approximation in the TT-format for rank-adaptive tensor completion
- Preconditioned low-rank Riemannian optimization for linear systems with tensor product structure
- Riemannian optimization for high-dimensional tensor completion
- Low-rank tensor methods for partial differential equations
- Iterative methods based on soft thresholding of hierarchical tensors
- A two-stage surrogate model for neo-Hookean problems based on adaptive proper orthogonal decomposition and hierarchical tensor approximation
- Variants of alternating least squares tensor completion in the tensor train format
- A tensor train approach for internet traffic data completion
- New Riemannian preconditioned algorithms for tensor completion via polyadic decomposition
- Adaptive integration of nonlinear evolution equations on tensor manifolds
- T-product factorization method for internet traffic data completion with spatio-temporal regularization
- Tensor completion in hierarchical tensor representations
- Iterative algorithms for the post-processing of high-dimensional data
- Stability analysis of hierarchical tensor methods for time-dependent PDEs
- A Riemannian gradient sampling algorithm for nonsmooth optimization on manifolds
- Minimality of tensors of fixed multilinear rank
- The Condition Number of Riemannian Approximation Problems
- A Riemannian trust region method for the canonical tensor rank approximation problem
- Tensor networks and hierarchical tensors for the solution of high-dimensional partial differential equations
- Low-rank tensor recovery using sequentially optimal modal projections in iterative hard thresholding (SeMPIHT)
- Parallel tensor methods for high-dimensional linear PDEs
- A TT-based hierarchical framework for decomposing high-order tensors
- Constrained optimization with low-rank tensors and applications to parametric problems with PDEs