scientific article; zbMATH DE number 6276134
From MaRDI portal
Publication:5405143
zbMath 1283.62095; MaRDI QID: Q5405143
Karsten M. Borgwardt, Bernhard Schölkopf, Arthur Gretton, Malte J. Rasch, Alexander J. Smola
Publication date: 1 April 2014
Full work available at URL: http://www.jmlr.org/papers/v13/gretton12a.html
Title: A kernel two-sample test
Keywords: hypothesis testing; kernel methods; two-sample test; schema matching; integral probability metric; uniform convergence bounds
MSC classification: Nonparametric regression and quantile regression (62G08); Nonparametric hypothesis testing (62G10); Large deviations (60F10)
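The keywords above refer to the kernel two-sample test built on the maximum mean discrepancy (MMD). Below is a minimal sketch of the unbiased squared-MMD statistic together with a permutation test; the Gaussian kernel, the median-heuristic bandwidth, and all function names are assumptions made here for illustration and are not part of this zbMATH record.

```python
import numpy as np

def gaussian_kernel(a, b, bandwidth):
    """Gaussian (RBF) kernel matrix between rows of a (m x d) and b (n x d)."""
    sq_dists = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
    return np.exp(-sq_dists / (2 * bandwidth**2))

def mmd2_unbiased(x, y, bandwidth):
    """Unbiased estimate of MMD^2 between samples x (m x d) and y (n x d)."""
    m, n = len(x), len(y)
    kxx = gaussian_kernel(x, x, bandwidth)
    kyy = gaussian_kernel(y, y, bandwidth)
    kxy = gaussian_kernel(x, y, bandwidth)
    # Drop diagonal terms so the within-sample averages are unbiased.
    term_xx = (kxx.sum() - np.trace(kxx)) / (m * (m - 1))
    term_yy = (kyy.sum() - np.trace(kyy)) / (n * (n - 1))
    return term_xx + term_yy - 2 * kxy.mean()

def mmd_permutation_test(x, y, n_perm=500, seed=0):
    """Two-sample test: p-value of the observed MMD^2 under random relabelling."""
    rng = np.random.default_rng(seed)
    pooled = np.vstack([x, y])
    # Median-heuristic bandwidth (an assumed, commonly used default, not from the record).
    dists = np.sqrt(((pooled[:, None] - pooled[None, :]) ** 2).sum(-1))
    bandwidth = np.median(dists[dists > 0])
    observed = mmd2_unbiased(x, y, bandwidth)
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(len(pooled))
        xs, ys = pooled[perm[:len(x)]], pooled[perm[len(x):]]
        count += mmd2_unbiased(xs, ys, bandwidth) >= observed
    return observed, (count + 1) / (n_perm + 1)
```

Usage, under the stated assumptions: observed, p = mmd_permutation_test(x, y) for samples x of shape (m, d) and y of shape (n, d); a small p-value is evidence that the two samples were drawn from different distributions.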
Related Items
A robust and nonparametric two-sample test in high dimensions, Graph-based two-sample tests for data with repeated observations, Convergence and finite sample approximations of entropic regularized Wasserstein distances in Gaussian and RKHS settings, Unified field theoretical approach to deep and recurrent neuronal networks, Bayesian Kernel Two-Sample Testing, Log-Rank-Type Tests for Equality of Distributions in High-Dimensional Spaces, Optimal multiple change-point detection for high-dimensional data, Quasi-Random Sampling for Multivariate Distributions via Generative Neural Networks, Comparing two populations using Bayesian Fourier series density estimation, Generative Adversarial Network for Probabilistic Forecast of Random Dynamical Systems, Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes, Distributionally robust unsupervised domain adaptation, Distributed inference for two‐sample U‐statistics in massive data analysis, A moment-matching metric for latent variable generative models, On the capacity of deep generative networks for approximating distributions, Adversarial strategy for transductive zero-shot learning, Measuring, Testing, and Identifying Heterogeneity of Large Parallel Datasets, Lead-lag detection and network clustering for multivariate time series with an application to the US equity market, Detecting distributional differences in labeled sequence data with application to tropical cyclone satellite imagery, Adversarial learning for counterfactual fairness, Sparse machine learning in Banach spaces, Multivariate Rank-Based Distribution-Free Nonparametric Testing Using Measure Transportation, Optimal Reaction Coordinates: Variational Characterization and Sparse Computation, Test for homogeneity of random objects on manifolds with applications to biological shape analysis, EuMMD: efficiently computing the MMD two-sample test statistic for univariate data, Nonasymptotic one- and two-sample tests in high dimension with unknown covariance structure, Stein variational gradient descent with learned direction, Weighted signature kernels, Chi-squared test for hypothesis testing of homogeneity, Domain adversarial neural networks for domain generalization: when it works and how to improve, Mirror variational transport: a particle-based algorithm for distributional optimization on constrained domains, Clustering multivariate time series using energy distance, Geometry of EM and related iterative algorithms, Self-supervised Metric Learning in Multi-View Data: A Downstream Task Perspective, Online MCMC Thinning with Kernelized Stein Discrepancy, Nonlinear directed acyclic graph estimation based on the kernel partial correlation coefficient, Estimating Optimal Infinite Horizon Dynamic Treatment Regimes via pT-Learning, Regularising inverse problems with generative machine learning models, Measuring and testing homogeneity of distributions by characteristic distance, Goal-oriented sensitivity analysis of hyperparameters in deep learning, Geometrical Insights for Implicit Generative Modeling, Weighted bootstrap for two-sample \(U\)-statistics, Level sets semimetrics for probability measures with applications in hypothesis testing, A Kernel Log-Rank Test of Independence for Right-Censored Data, Generalized martingale difference divergence: detecting conditional mean independence with applications in variable screening, Minimax rate of distribution estimation on unknown submanifolds under adversarial losses,
Dimension-agnostic inference using cross U-statistics, Limiting distributions of graph-based test statistics on sparse and dense graphs, A hyperbolic divergence based nonparametric test for two‐sample multivariate distributions, Dependence Model Assessment and Selection with DecoupleNets, A machine learning framework for geodesics under spherical Wasserstein-Fisher-Rao metric and its application for weighted sample generation, Feature engineering with regularity structures, Some models are useful, but how do we know which ones? Towards a unified Bayesian model taxonomy, Coefficient-based regularized distribution regression, Testing homogeneity in high dimensional data through random projections, Testing equality of several distributions in separable metric spaces: a maximum mean discrepancy based approach, Hilbert C∗-Module for Analyzing Structured Data, Manifold energy two-sample test, Characteristic kernels on Hilbert spaces, Banach spaces, and on sets of measures, Estimation of Copulas via Maximum Mean Discrepancy, Expected Conditional Characteristic Function-based Measures for Testing Independence, FastMMD: Ensemble of Circular Discrepancy for Efficient Two-Sample Test, Causal Discovery via Reproducing Kernel Hilbert Space Embeddings, Deep Knockoffs, Dimension Reduction for Gaussian Process Emulation: An Application to the Influence of Bathymetry on Tsunami Heights, Deep Semisupervised Zero-Shot Learning with Maximum Mean Discrepancy, Statistical inference on random dot product graphs: a survey, Characteristic and Universal Tensor Product Kernels, Computing functions of random variables via reproducing kernel Hilbert space representations, Scan B-statistic for kernel change-point detection, Two‐sample test based on classification probability, Global Sensitivity Analysis for Optimization with Variable Selection, Kernel-Based Tests for Joint Independence, Minimax Estimation of Kernel Mean Embeddings, Balanced joint maximum mean discrepancy for deep transfer learning, Graphical Models for Processing Missing Data, Stabilizing Invertible Neural Networks Using Mixture Models, A Generalized Kernel Method for Global Sensitivity Analysis, Mean-Field Controls with Q-Learning for Cooperative MARL: Convergence and Complexity Analysis, Bi-fidelity variational auto-encoder for uncertainty quantification, Scalable kernel two-sample tests via empirical likelihood and jackknife, Minimax optimality of permutation tests, Asymptotic normality of a generalized maximum mean discrepancy estimator, On the optimal estimation of probability measures in weak and strong topologies, On the use of random forest for two-sample testing, Nonparametric feature selection by random forests and deep neural networks, On Gaussian kernels on Hilbert spaces and kernels on hyperbolic spaces, A comparison of likelihood-free methods with and without summary statistics, The randomized information coefficient: assessing dependencies in noisy data, Interpretable domain adaptation via optimization over the Stiefel manifold, On kernel methods for covariates that are rankings, Challenges in Markov chain Monte Carlo for Bayesian neural networks, Classification accuracy as a proxy for two-sample testing, Two-sample test for equal distributions in separate metric space: New maximum mean discrepancy based approaches,
The two-sample problem for Poisson processes: adaptive tests with a nonasymptotic wild bootstrap approach, Directional differentiability for supremum-type functionals: statistical applications, WIKS: a general Bayesian nonparametric index for quantifying differences between two populations, Born machine model based on matrix product state quantum circuit, Comparison of a large number of regression curves, Principled analytic classifier for positive-unlabeled learning via weighted integral probability metric, A \(k\)-sample test for functional data based on generalized maximum mean discrepancy, A general Monte Carlo method for multivariate goodness-of-fit testing applied to elliptical families, Robust adaptation regularization based on within-class scatter for domain adaptation, High dimensional two-sample test based on the inter-point distance, A one-sample test for normality with kernel methods, Bayesian nonparametric test for independence between random vectors, Near-optimal coresets of kernel density estimates, PI-VAE: physics-informed variational auto-encoder for stochastic differential equations, Expected similarity estimation for large-scale batch and streaming anomaly detection, Resampling approach for cluster model selection, Stein's method meets computational statistics: a review of some recent developments, Deep physics corrector: a physics enhanced deep learning architecture for solving stochastic differential equations, Learning kernels for unsupervised domain adaptation with applications to visual object recognition, Asymptotics and practical aspects of testing normality with kernel methods, Large-scale kernel methods for independence testing, Multivariate tests of independence based on a new class of measures of independence in reproducing kernel Hilbert space, A \(U\)-statistic approach for a high-dimensional two-sample mean testing problem under non-normality and Behrens-Fisher setting, A Hilbert Space Embedding for Distributions, On the empirical estimation of integral probability metrics, A note on microlocal kernel design for some slow-fast stochastic differential equations with critical transitions and application to EEG signals, On some consistent tests of mutual independence among several random vectors of arbitrary dimensions, Comparing a large number of multivariate distributions, Asymptotic distribution and detection thresholds for two-sample tests based on geometric graphs, Equitability, interval estimation, and statistical power, Model-free inference of diffusion networks using RKHS embeddings, The affinely invariant distance correlation, Covariate balancing propensity score by tailored loss functions, Asymptotics, finite-sample comparisons and applications for two-sample tests with functional data, Unsupervised group matching with application to cross-lingual topic matching without alignment information, Assessing similarity of random sets via skeletons, Testing equality of distributions of random convex compact sets via theory of \(\mathfrak{N} \)-distances, On uniform consistency of nonparametric tests. I,
Stein variational gradient descent with local approximations, Kernel Distribution Embeddings: Universal Kernels, Characteristic Kernels and Kernel Metrics on Distributions, Distance-based and RKHS-based dependence metrics in high dimension, Robust multivariate nonparametric tests via projection averaging, Some tests of independence based on maximum mean discrepancy and ranks of nearest neighbors, On the expectation of a persistence diagram by the persistence weighted kernel, A Kernel Multiple Change-point Algorithm via Model Selection, Equivalence of distance-based and RKHS-based statistics in hypothesis testing, Finding robust transfer features for unsupervised domain adaptation, Dimensionality reduction of complex metastable systems via kernel embeddings of transition manifolds, Identifying outliers using multiple kernel canonical correlation analysis with application to imaging genetics, On some characterizations and multidimensional criteria for testing homogeneity, symmetry and independence, Generative adversarial networks with joint distribution moment matching, Assessment of ordered sequential data assimilation, The classification permutation test: a flexible approach to testing for covariate imbalance in observational studies, Global and local two-sample tests via regression, Two-sample Hypothesis Testing for Inhomogeneous Random Graphs, Deep graph similarity learning: a survey, Interpoint distance based two sample tests in high dimension, Model-free two-sample test for network-valued data, Convergence analysis of deterministic kernel-based quadrature rules in misspecified settings, On high dimensional two-sample tests based on nearest neighbors, Bayesian optimization with approximate set kernels, On some graph-based two-sample tests for high dimension, low sample size data, Multiview Alignment and Generation in CCA via Consistent Latent Encoding, Inferring 3D shapes from image collections using adversarial networks, Outlier detection in non-elliptical data by kernel MRCD, Scalable Bayesian Nonparametric Clustering and Classification, Product-form estimators: exploiting independence to scale up Monte Carlo, Robust comparison of kernel densities on spherical domains, Generalization error of GAN from the discriminator's perspective, A rank-based Cramér-von-Mises-type test for two samples, Finite sample properties of parametric MMD estimation: robustness to misspecification and dependence, Nuclear discrepancy for single-shot batch active learning, A new framework for distance and kernel-based metrics in high dimensions, A deep learning framework for hybrid heterogeneous transfer learning, Learning dynamical systems from data: a simple cross-validation perspective. I: Parametric kernel flows,
Two-sample tests for multivariate repeated measurements of histogram objects with applications to wearable device data, Model-based kernel sum rule: kernel Bayesian inference with probabilistic models, Some new copula based distribution-free tests of independence among several random variables, Antithetic and Monte Carlo kernel estimators for partial rankings, Multi-sample comparison using spatial signs for infinite dimensional data, An Omnibus Non-Parametric Test of Equality in Distribution for Unknown Functions, Statistical distances in goodness-of-fit, Mathematical modeling of cancer signaling addressing tumor heterogeneity, A regression perspective on generalized distance covariance and the Hilbert-Schmidt independence criterion, A hierarchically low-rank optimal transport dissimilarity measure for structured data, High-dimensional variable screening through kernel-based conditional mean dependence, Local permutation tests for conditional independence
Uses Software