Feature selection via dependence maximization
Recommendations
- High-Dimensional Feature Selection by Feature-Wise Kernelized Lasso
- Feature selection via maximizing fuzzy dependency
- Conditional likelihood maximisation: a unifying framework for information theoretic feature selection
- Diverse data selection via combinatorial quasi-concavity of distance covariance: a polynomial time global minimax algorithm
- DOI: 10.1162/153244303322753733
Cited in (66)
- Minimax estimation of kernel mean embeddings
- The Monte Carlo feature selection and interdependency discovery is unbiased
- scientific article; zbMATH DE number 1759572
- Feature Selection via Coalitional Game Theory
- Feature selection from local lift dependence-based partitions
- Large-scale kernel methods for independence testing
- Unsupervised group matching with application to cross-lingual topic matching without alignment information
- A factor graph model for unsupervised feature selection
- Conditional likelihood maximisation: a unifying framework for information theoretic feature selection
- Dependency maximization forward feature selection algorithms based on normalized cross-covariance operator and its approximated form for high-dimensional data
- Feature selection with SVD entropy: some modification and extension
- An Updated Literature Review of Distance Correlation and Its Applications to Time Series
- Classifier-dependent feature selection via greedy methods
- Multi-graph fusion graph convolutional networks with pseudo-label supervision
- Goodness-of-fit test for nonparametric regression models: smoothing spline ANOVA models as example
- Supervised principal component analysis: visualization, classification and regression on subspaces and submanifolds
- Shrinkage estimation of higher-order Bochner integrals
- Alternative second-order cone programming formulations for support vector classification
- Fast multi-label feature selection based on information-theoretic feature ranking
- On sufficient variable screening using log odds ratio filter
- Efficient feature selection via analysis of relevance and redundancy
- Sufficient variable selection using independence measures for continuous response
- An overview of recent advancements in causal studies
- The randomized information coefficient: assessing dependencies in noisy data
- Unbalanced optimal transport and maximum mean discrepancies: interconnections and rapid evaluation
- Kernel-based measures of association
- Large margin subspace learning for feature selection
- A General Framework of Nonparametric Feature Selection in High-Dimensional Data
- Fuzzy rough based regularization in generalized multiple kernel learning
- A Hilbert Space Embedding for Distributions
- Wasserstein filter for variable screening in binary classification in the reproducing kernel Hilbert space
- Learning to select features using their properties
- Non-parametric classifier-independent feature selection
- Advanced conjoint analysis using feature selection via support vector machines
- Global sensitivity analysis with dependence measures
- Bias reduction and metric learning for nearest-neighbor estimation of Kullback-Leibler divergence
- scientific article; zbMATH DE number 7370644
- Feature selection for support vector machines via mixed integer linear programming
- Diffusion \(K\)-means clustering on manifolds: provable exact recovery via semidefinite relaxations
- Budget constrained non-monotonic feature selection
- Ranking Features to Promote Diversity: An Approach Based on Sparse Distance Correlation
- Characteristic and universal tensor product kernels
- A kernel- and optimal transport- based test of independence between covariates and right-censored lifetimes
- Subspace learning for unsupervised feature selection via matrix factorization
- The Max-Min approach to feature selection: Its foundations and practical potential
- Grouped feature importance and combined features effect plot
- Hanson-Wright inequality in Hilbert spaces with application to \(K\)-means clustering for non-Euclidean data
- An instance voting approach to feature selection
- The exact equivalence of distance and kernel methods in hypothesis testing
- High-Dimensional Feature Selection by Feature-Wise Kernelized Lasso
- Maximum weight and minimum redundancy: a novel framework for feature subset selection
- Analyzing high dimensional correlated data using feature ranking and classifiers
- Sparse optimization in feature selection: application in neuroimaging
- Toward a Fundamental Theory of Optimal Feature Selection: Part I
- Optimal rates for independence testing via U-statistic permutation tests
- Sparse parameter identification of stochastic dynamical systems
- Goal-oriented sensitivity analysis of hyperparameters in deep learning
- Sparse nonparametric regression with regularized tensor product kernel
- A robust-equitable measure for feature ranking and selection
- scientific article; zbMATH DE number 7750671
- Neural decoding with kernel-based metric learning
- Estimation of Bergsma's covariance
- Diverse data selection via combinatorial quasi-concavity of distance covariance: a polynomial time global minimax algorithm
- Support Recovery and Parameter Identification of Multivariate ARMA Systems with Exogenous Inputs
- Constructing effective personalized policies using counterfactual inference from biased data sets with many features
- Feature selection via maximizing fuzzy dependency
This page was built for publication: Feature selection via dependence maximization
MaRDI item Q5405165