Dimensionality reduction for density ratio estimation in high-dimensional spaces
DOI: 10.1016/j.neunet.2009.07.007 · zbMATH Open: 1401.62097 · OpenAlex: W2119410311 · Wikidata: Q44528641 (Scholia: Q44528641) · MaRDI QID: Q1784536 · FDO: Q1784536
Authors: Masashi Sugiyama, Motoaki Kawanabe, Pui Ling Chui
Publication date: 27 September 2018
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2009.07.007
Recommendations
- Direct density-ratio estimation with dimensionality reduction via least-squares hetero-distributional subspace search
- Online direct density-ratio estimation applied to inlier-based outlier detection
- Statistical analysis of kernel-based least-squares density-ratio estimation
- Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation
- Density ratio estimation in machine learning. Foreword by Thomas G. Dietterich
Keywords: dimensionality reduction; density ratio estimation; local Fisher discriminant analysis; unconstrained least-squares importance fitting
MSC classification: Density estimation (62G07) · Classification and discrimination; cluster analysis (statistical aspects) (62H30) · Learning and adaptive systems in artificial intelligence (68T05)
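The keywords above point at the paper's core recipe: find a low-dimensional "hetero-distributional" subspace (via local Fisher discriminant analysis) and estimate the density ratio there with unconstrained least-squares importance fitting (uLSIF). Below is a minimal, illustrative sketch of the uLSIF step only, not the paper's full procedure; the Gaussian kernel width, regularization strength, and number of centers are placeholder assumptions, not values from the paper.

```python
import numpy as np

def ulsif(x_nu, x_de, sigma=1.0, lam=1e-3, n_centers=100, seed=0):
    """Sketch of uLSIF: model the ratio r(x) = p_nu(x)/p_de(x) as a
    linear combination of Gaussian kernels and fit the coefficients
    by regularized least squares (closed form)."""
    rng = np.random.default_rng(seed)
    # Kernel centers are drawn from the numerator samples, as in uLSIF.
    idx = rng.choice(len(x_nu), min(n_centers, len(x_nu)), replace=False)
    centers = x_nu[idx]

    def design(x):
        # (n, b) matrix of Gaussian kernel values K(x_i, c_l)
        d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    phi_de = design(x_de)
    phi_nu = design(x_nu)
    # H = E_de[k k^T], h = E_nu[k]; solve (H + lam I) alpha = h.
    H = phi_de.T @ phi_de / len(x_de)
    h = phi_nu.mean(axis=0)
    alpha = np.linalg.solve(H + lam * np.eye(len(centers)), h)
    return lambda x: design(x) @ alpha  # estimated density-ratio function
```

For identically distributed samples the true ratio is 1, so the fitted values should hover around 1; in practice the hyperparameters (`sigma`, `lam`) are chosen by cross-validation, which this sketch omits.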
Cites Work
- Principal component analysis.
- Least angle regression. (With discussion)
- Title not available
- Inferences for case-control and semiparametric two-sample density ratio models
- Pattern recognition and machine learning.
- Title not available
- On Information and Sufficiency
- Two-sample test statistics for measuring discrepancies between two multivariate probability density functions using kernel-based density estimates
- Pattern classification.
- Title not available
- Title not available
- Multivariate generalizations of the Wald-Wolfowitz and Smirnov two-sample tests
- Nonparametric and semiparametric models.
- Title not available
- Permutation tests for equality of distributions in high-dimensional settings
- A Distribution Free Version of the Smirnov Two Sample Test in the $p$-Variate Case
- Direct importance estimation for covariate shift adaptation
- Improving predictive inference under covariate shift by weighting the log-likelihood function
- On the influence of the kernel on the consistency of support vector machines
- Covariate shift adaptation by importance weighted cross validation
- Input-dependent estimation of generalization error under covariate shift
- Title not available
- Soft margins for AdaBoost
- Title not available
- Dimensionality reduction of multimodal labeled data by local Fisher discriminant analysis
- Estimating the support of a high-dimensional distribution
- Title not available
- A least-squares approach to direct importance estimation
- Nonparametric Conditional Density Estimation Using Piecewise-Linear Solution Path of Kernel Quantile Regression
- An interior-point method for large-scale \(l_1\)-regularized logistic regression
- Regression and the Moore-Penrose pseudoinverse
- Edgeworth Approximation of Multivariate Differential Entropy
- A new algorithm of non-Gaussian component analysis with radial kernel functions
- Title not available
- Computational complexity of kernel-based density-ratio estimation: a condition number analysis
- On the Asymptotic Properties of a Nonparametric \(L_1\)-Test Statistic of Homogeneity
- Semiparametric density estimation under a two-sample density ratio model
- Active learning algorithm using the maximum weighted log-likelihood estimator
- Title not available
- Sufficient dimension reduction via squared-loss mutual information estimation
- Adaptive importance sampling for value function approximation in off-policy reinforcement learning
- Pool-based active learning in approximate linear regression
- Robust weights and designs for biased regression models: Least squares and generalized \(M\)-estimation
- Estimating Squared-Loss Mutual Information for Independent Component Analysis
- Title not available
Cited In (11)
- Direct density-ratio estimation with dimensionality reduction via least-squares hetero-distributional subspace search
- Change-point detection in time-series data by relative density-ratio estimation
- Online direct density-ratio estimation applied to inlier-based outlier detection
- Necessary and sufficient conditions of proper estimators based on self density ratio for unnormalized statistical models
- Sequential change‐point detection based on direct density‐ratio estimation
- Learning under nonstationarity: covariate shift and class-balance change
- Density ratio estimation in machine learning. Foreword by Thomas G. Dietterich
- Machine learning with squared-loss mutual information
- Density-difference estimation
- Least-squares two-sample test
- Probability density function estimation with the frequency polygon transform