Dimensionality reduction for density ratio estimation in high-dimensional spaces
From MaRDI portal
Publication: Q1784536
Recommendations
- Direct density-ratio estimation with dimensionality reduction via least-squares hetero-distributional subspace search
- Online direct density-ratio estimation applied to inlier-based outlier detection
- Statistical analysis of kernel-based least-squares density-ratio estimation
- Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation
- Density ratio estimation in machine learning. Foreword by Thomas G. Dietterich
Cites work
- Scientific article (untitled); zbMATH DE number 5957198
- Scientific article (untitled); zbMATH DE number 5957245
- Scientific article (untitled); zbMATH DE number 5957325
- Scientific article (untitled); zbMATH DE number 425941
- Scientific article (untitled); zbMATH DE number 47593
- Scientific article (untitled); zbMATH DE number 107482
- Scientific article (untitled); zbMATH DE number 1332320
- Scientific article (untitled); zbMATH DE number 845707
- Scientific article (untitled); zbMATH DE number 854710
- Scientific article (untitled); zbMATH DE number 961607
- Scientific article (untitled); zbMATH DE number 3068126
- A Distribution Free Version of the Smirnov Two Sample Test in the $p$-Variate Case
- A least-squares approach to direct importance estimation
- A new algorithm of non-Gaussian component analysis with radial kernel functions
- Active learning algorithm using the maximum weighted log-likelihood estimator
- Adaptive importance sampling for value function approximation in off-policy reinforcement learning
- An interior-point method for large-scale \(l_1\)-regularized logistic regression
- Computational complexity of kernel-based density-ratio estimation: a condition number analysis
- Covariate shift adaptation by importance weighted cross validation
- Dimensionality reduction of multimodal labeled data by local Fisher discriminant analysis
- Direct importance estimation for covariate shift adaptation
- Edgeworth Approximation of Multivariate Differential Entropy
- Estimating Squared-Loss Mutual Information for Independent Component Analysis
- Estimating the support of a high-dimensional distribution
- Improving predictive inference under covariate shift by weighting the log-likelihood function
- Inferences for case-control and semiparametric two-sample density ratio models
- Input-dependent estimation of generalization error under covariate shift
- Least angle regression. (With discussion)
- Multivariate generalizations of the Wald-Wolfowitz and Smirnov two-sample tests
- Nonparametric Conditional Density Estimation Using Piecewise-Linear Solution Path of Kernel Quantile Regression
- Nonparametric and semiparametric models.
- On Information and Sufficiency
- On the Asymptotic Properties of a Nonparametric \(L_1\)-Test Statistic of Homogeneity
- On the influence of the kernel on the consistency of support vector machines
- Pattern classification.
- Pattern recognition and machine learning.
- Permutation tests for equality of distributions in high-dimensional settings
- Pool-based active learning in approximate linear regression
- Principal component analysis.
- Regression and the Moore-Penrose pseudoinverse
- Robust weights and designs for biased regression models: Least squares and generalized \(M\)-estimation
- Semiparametric density estimation under a two-sample density ratio model
- Soft margins for AdaBoost
- Sufficient dimension reduction via squared-loss mutual information estimation
- Two-sample test statistics for measuring discrepancies between two multivariate probability density functions using kernel-based density estimates
Cited in (11)
- Direct density-ratio estimation with dimensionality reduction via least-squares hetero-distributional subspace search
- Change-point detection in time-series data by relative density-ratio estimation
- Necessary and sufficient conditions of proper estimators based on self density ratio for unnormalized statistical models
- Online direct density-ratio estimation applied to inlier-based outlier detection
- Sequential change-point detection based on direct density-ratio estimation
- Learning under nonstationarity: covariate shift and class-balance change
- Density ratio estimation in machine learning. Foreword by Thomas G. Dietterich
- Machine learning with squared-loss mutual information
- Density-difference estimation
- Least-squares two-sample test
- Probability density function estimation with the frequency polygon transform