Direct importance estimation for covariate shift adaptation
From MaRDI portal
Publication:144623
DOI: 10.1007/s10463-008-0197-x zbMath: 1294.62069 OpenAlex: W2062291443 MaRDI QID: Q144623
Shinichi Nakajima, Taiji Suzuki, Motoaki Kawanabe, Paul von Bünau, Hisashi Kashima, Masashi Sugiyama
Publication date: 30 August 2008
Published in: Annals of the Institute of Statistical Mathematics
Full work available at URL: https://doi.org/10.1007/s10463-008-0197-x
Keywords: importance sampling, Kullback-Leibler divergence, model misspecification, covariate shift, likelihood cross validation
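The title and keywords describe direct importance estimation: fitting the importance weight w(x) = p_test(x)/p_train(x) directly, without estimating either density. The following is a minimal NumPy sketch in the spirit of the paper's KLIEP procedure; the Gaussian kernel width, step size, iteration count, and projected-gradient loop are illustrative choices for this sketch, not the authors' exact algorithm.

```python
import numpy as np

def kliep(x_train, x_test, sigma=1.0, lr=2e-3, n_iter=2000):
    """Sketch of direct importance estimation (KLIEP-style):
    model w(x) as a non-negative mixture of Gaussian kernels centered
    on test points, maximize the log-likelihood of the test sample,
    and keep the mean importance over the training sample equal to 1."""
    centers = x_test  # kernel centers placed on the test points

    def phi(x):
        # Gaussian kernel design matrix, shape (n_samples, n_centers)
        d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    Phi_te, Phi_tr = phi(x_test), phi(x_train)
    alpha = np.ones(centers.shape[0]) / centers.shape[0]
    b = Phi_tr.mean(axis=0)          # constraint vector: b @ alpha = 1
    alpha /= b @ alpha               # start on the constraint surface
    for _ in range(n_iter):
        w_te = Phi_te @ alpha
        # gradient of the mean log importance on the test sample
        grad = (Phi_te / w_te[:, None]).mean(axis=0)
        alpha += lr * grad
        alpha = np.maximum(alpha, 0.0)   # non-negativity
        alpha /= b @ alpha               # renormalize: mean train weight = 1
    return lambda x: phi(x) @ alpha

# toy check: training sample from N(0,1), test sample from N(0.5,1)
rng = np.random.default_rng(0)
x_tr = rng.normal(0.0, 1.0, size=(500, 1))
x_te = rng.normal(0.5, 1.0, size=(500, 1))
w = kliep(x_tr, x_te, sigma=1.0)
```

By construction the estimated weights average to one over the training sample, while their average over the shifted test sample exceeds one, since the fitted ratio is larger where the test density is larger.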
Related Items
- Statistical analysis of distance estimators with density differences and density ratios
- Adapting a classification rule to local and global shift when only unlabelled data are available
- Interpretable domain adaptation via optimization over the Stiefel manifold
- One class proximal support vector machines
- Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation
- Domain adaptation and sample bias correction theory and algorithm for regression
- Equal percent bias reduction and variance proportionate modifying properties with mean-covariance preserving matching
- Positive-unlabeled classification under class-prior shift: a prior-invariant approach based on density ratio estimation
- Cross-domain decision making with parameter transfer based on value function
- Estimating Density Ratio of Marginals to Joint: Applications to Causal Inference
- Statistical analysis of kernel-based least-squares density-ratio estimation
- Computational complexity of kernel-based density-ratio estimation: a condition number analysis
- Least-squares two-sample test
- Transfer estimation of evolving class priors in data stream classification
- Relative deviation learning bounds and generalization with unbounded loss functions
- Pool-based active learning in approximate linear regression
- A theory of learning from different domains
- Density-Difference Estimation
- Information-Maximization Clustering Based on Squared-Loss Mutual Information
- Active learning for noisy oracle via density power divergence
- Direct Learning of Sparse Changes in Markov Networks by Density Ratio Estimation
- Semi-supervised learning of class balance under class-prior change by distribution matching
- Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence
- Direct density-ratio estimation with dimensionality reduction via least-squares hetero-distributional subspace search
- Semi-supervised speaker identification under covariate shift
- Dimensionality reduction for density ratio estimation in high-dimensional spaces
- Change-point detection in time-series data by relative density-ratio estimation
- Least-Squares Independent Component Analysis
- Off-policy temporal difference learning with distribution adaptation in fast mixing chains
- Direct importance estimation for covariate shift adaptation
- densratio
- Domain Adaptation Using the Grassmann Manifold
- Machine learning with squared-loss mutual information
- Sequential minimal optimization in convex clustering repetitions
- Sequential change-point detection based on direct density-ratio estimation
- Constructive setting for problems of density ratio estimation
- Relative Density-Ratio Estimation for Robust Distribution Comparison
- Tnn: a transfer learning classifier based on weighted nearest neighbors
- Semi-supervised logistic discrimination via labeled data and unlabeled data from different sampling distributions
Cites Work
- Direct importance estimation for covariate shift adaptation
- On Kullback-Leibler loss and density estimation
- Sharper bounds for Gaussian and empirical processes
- Improving predictive inference under covariate shift by weighting the log-likelihood function
- A Bennett concentration inequality and its application to suprema of empirical processes
- Convergence rates for density estimation with Bernstein polynomials
- Nonparametric and semiparametric models
- Weak convergence and empirical processes. With applications to statistics
- Smoothed functional principal components analysis by choice of norm
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Local Rademacher complexities
- DOI: 10.1162/153244302760185252
- Input-dependent estimation of generalization error under covariate shift
- Asymptotic Properties of Maximum Likelihood Estimators and Likelihood Ratio Tests Under Nonstandard Conditions
- Sample Selection Bias as a Specification Error
- Improving the sample complexity using global data
- Trading Variance Reduction with Unbiasedness: The Regularized Subspace Information Criterion for Robust Model Selection in Kernel Regression
- Soft margins for AdaBoost
- New concentration inequalities in product spaces