Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation
DOI: 10.1007/s10463-011-0343-8; zbMATH: 1440.62111; OpenAlex: W2155183960; MaRDI QID: Q1926013
Takafumi Kanamori, Masashi Sugiyama, Taiji Suzuki
Publication date: 27 December 2012
Published in: Annals of the Institute of Statistical Mathematics
Full work available at URL: https://doi.org/10.1007/s10463-011-0343-8
Keywords: logistic regression; Bregman divergence; density ratio; kernel mean matching; Kullback-Leibler importance estimation procedure; least-squares importance fitting
MSC: Density estimation (62G07); Asymptotic properties of nonparametric inference (62G20); Nonparametric estimation (62G05); Generalized linear models (logistic models) (62J12); Learning and adaptive systems in artificial intelligence (68T05); Statistical aspects of information-theoretic topics (62B10)
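For orientation, the unifying idea behind the keywords above can be sketched as follows (the notation is this note's reconstruction, not the paper's own): a ratio model \(g(x)\) is fitted to the true density ratio \(r(x) = p_{\mathrm{nu}}(x)/p_{\mathrm{de}}(x)\) by minimizing a denominator-weighted Bregman divergence generated by a convex function \(f\), whose empirical objective takes the form
\[
\widehat{\mathrm{BR}}_f(g) = \frac{1}{n_{\mathrm{de}}} \sum_{j=1}^{n_{\mathrm{de}}} \Bigl\{ \partial f\bigl(g(x^{\mathrm{de}}_j)\bigr)\, g(x^{\mathrm{de}}_j) - f\bigl(g(x^{\mathrm{de}}_j)\bigr) \Bigr\} - \frac{1}{n_{\mathrm{nu}}} \sum_{i=1}^{n_{\mathrm{nu}}} \partial f\bigl(g(x^{\mathrm{nu}}_i)\bigr),
\]
where \(x^{\mathrm{de}}_j \sim p_{\mathrm{de}}\) and \(x^{\mathrm{nu}}_i \sim p_{\mathrm{nu}}\). Choosing \(f(t) = t \log t - t\) recovers the Kullback-Leibler importance estimation procedure, and \(f(t) = (t-1)^2/2\) recovers least-squares importance fitting, consistent with the keyword list above.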
Cites Work
- Direct importance estimation for covariate shift adaptation
- Statistical analysis of kernel-based least-squares density-ratio estimation
- Direct density-ratio estimation with dimensionality reduction via least-squares hetero-distributional subspace search
- Robust parameter estimation with a small bias against heavy contamination
- A compact formulation of an elastoplastic analysis problem
- Dual representation of \(\phi\)-divergences and applications
- Improving predictive inference under covariate shift by weighting the log-likelihood function
- Semiparametric density estimation under a two-sample density ratio model
- Least angle regression (with discussion)
- Least-squares two-sample test
- An introduction to variational methods for graphical models
- Test of homogeneity in semiparametric two-sample density ratio models
- A comparison of related density-based minimum divergence estimators
- DOI: 10.1162/153244302760185252
- Density Ratio Estimation in Machine Learning
- Least-Squares Independent Component Analysis
- Input-dependent estimation of generalization error under covariate shift
- Estimating Squared-Loss Mutual Information for Independent Component Analysis
- Atomic Decomposition by Basis Pursuit
- Robust and efficient estimation by minimising a density power divergence
- Inferences for case-control and semiparametric two-sample density ratio models
- Information Geometry of U-Boost and Bregman Divergence
- Estimating Divergence Functionals and the Likelihood Ratio by Convex Risk Minimization
- Elements of Information Theory
- Convex Analysis
- On Information and Sufficiency
- Robust Statistics
- The elements of statistical learning. Data mining, inference, and prediction
- Logistic regression, AdaBoost and Bregman distances