10.1162/153244303322753751
Publication:4827827
DOI: 10.1162/153244303322753751
zbMath: 1102.68605
OpenAlex: W4251871045
MaRDI QID: Q4827827
Authors: Mike Tipping, Jason Weston, Bernhard Schölkopf, André Elisseeff
Publication date: 23 November 2004
Published in: CrossRef Listing of Deleted DOIs
Full work available at URL: https://doi.org/10.1162/153244303322753751
Related Items
Exact penalization for cardinality and rank-constrained optimization problems via partial regularization
Tracking hedge funds returns using sparse clones
Mathematical optimization in classification and regression trees
Non-parametric classifier-independent feature selection
Primal explicit max margin feature selection for nonlinear support vector machines
Finding causative genes from high-dimensional data: an appraisal of statistical and machine learning approaches
A concave optimization-based approach for sparse portfolio selection
DC approximation approaches for sparse optimization
Sparse and robust normal and \(t\)-portfolios by penalized \(L_q\)-likelihood minimization
Fast and scalable Lasso via stochastic Frank-Wolfe methods with a convergence guarantee
Sparse index clones via the sorted ℓ1-Norm
Lagrangian relaxation for SVM feature selection
Supervised classification and mathematical optimization
Embedded variable selection method using signomial classification
Budget constrained non-monotonic feature selection
Using zero-norm constraint for sparse probability density function estimation
Second-order optimality conditions and improved convergence results for regularization methods for cardinality-constrained optimization problems
Massively parallel feature selection: an approach based on variance preservation
Global convergence of proximal iteratively reweighted algorithm
High dimensional data classification and feature selection using support vector machines
Subset selection for multiple linear regression via optimization
An iterative SVM approach to feature selection and classification in high-dimensional datasets
Feature selection for linear SVMs under uncertain data: robust optimization based on difference of convex functions algorithms
Feature selection for linear SVM with provable guarantees
Feature selection in machine learning: an exact penalty approach using a difference of convex function algorithm
Pruning of error correcting output codes by optimization of accuracy-diversity trade off
Zero-norm regularized problems: equivalent surrogates, proximal MM method and statistical error bound
Sparse optimization via vector \(k\)-norm and DC programming with an application to feature selection for support vector machines
A random block-coordinate Douglas-Rachford splitting method with low computational complexity for binary logistic regression
A unifying framework for sparsity-constrained optimization
A smoothing method for sparse optimization over convex sets
On multivariate randomized classification trees: \(l_0\)-based sparsity, VC dimension and decomposition methods
A majorization-minimization approach to the sparse generalized eigenvalue problem
Distributed nonconvex constrained optimization over time-varying digraphs
Genetic algorithm versus classical methods in sparse index tracking
Constructing effective personalized policies using counterfactual inference from biased data sets with many features
Sparse canonical correlation analysis
Variable selection for multicategory SVM via adaptive sup-norm regularization
Variable selection for binary classification in large dimensions: comparisons and application to microarray data
An extrapolated proximal iteratively reweighted method for nonconvex composite optimization problems
Joint Feature Transformation and Selection Based on Dempster-Shafer Theory
Sparse weighted voting classifier selection and its linear programming relaxations
Learning sparse gradients for variable selection and dimension reduction
A DC programming approach for feature selection in support vector machines learning
The Support Feature Machine: Classification with the Least Number of Features and Application to Neuroimaging Data
Stochastic correlation coefficient ensembles for variable selection
Convex Optimization for Group Feature Selection in Networked Data
Assessing variable importance in clustering: a new method based on unsupervised binary decision trees
Feature elimination in kernel machines in moderately high dimensions
Combined SVM-based feature selection and classification
Learning modulo theories for constructive preference elicitation
Feature selection for support vector machines via mixed integer linear programming
DCA based algorithms for feature selection in multi-class support vector machine
DC programming and DCA: thirty years of developments
Sparse optimization in feature selection: application in neuroimaging
Solving \(\ell_0\)-penalized problems with simple constraints via the Frank-Wolfe reduced dimension method
A factor graph model for unsupervised feature selection
Salt and pepper noise removal based on an approximation of \(l_0\) norm
Gene expression modeling through positive Boolean functions
A sparsity driven kernel machine based on minimizing a generalization error bound
Feature selection for high-dimensional data
Supervised Learning by Support Vector Machines
Concave programming for minimizing the zero-norm over polyhedral sets
Convergent inexact penalty decomposition methods for cardinality-constrained problems
Arbitrary Norm Support Vector Machines
An effective procedure for feature subset selection in logistic regression based on information criteria
Successive convex approximations to cardinality-constrained convex programs: a piecewise-linear DC approach
Variable selection in kernel Fisher discriminant analysis by means of recursive feature elimination
The sparse signomial classification and regression model
DC Programming Approach for a Class of Nonconvex Programs Involving \(l_0\) Norm
Feature selection in SVM via polyhedral \(k\)-norm
Nonconvex and nonsmooth sparse optimization via adaptively iterative reweighted methods
Sparse Recovery via Partial Regularization: Models, Theory, and Algorithms
D.C. programming for sparse proximal support vector machines
Concave programming for finding sparse solutions to problems with convex constraints
Variable Selection for Support Vector Machines
Un-diversifying during crises: is it a good idea?
Spatially adaptive binary classifier using B-splines and total variation penalty
A theoretical understanding of self-paced learning
Double regularization methods for robust feature selection and SVM classification via DC programming
An interior point method for \(L_{1 / 2}\)-SVM and application to feature selection in classification
Constructing optimal sparse portfolios using regularization methods
Uses Software