Optimal filtering of square-integrable signals in Gaussian noise


zbMath: 0452.94003
MaRDI QID: Q1148851

M. S. Pinsker

Publication date: 1980

Published in: Problems of Information Transmission

Full work available at URL: http://mathnet.ru/eng/ppi/v16/i2/p52



Related Items

Minimax nonparametric estimation of pure quantum states, Bayesian nonparametric point estimation under a conjugate prior, Rates of convergence for minimum contrast estimators, Adaptive estimates of linear functionals, Density estimation for biased data., Asymptotic minimax risk for sup-norm loss: Solution via optimal recovery, Minimax risk over \(l_p\)-balls for \(l_q\)-error, Statistical properties of the method of regularization with periodic Gaussian reproducing kernel, Obtaining minimax lower bounds: a review, Estimation of the density of regression errors, Nonparametric estimation by convex programming, Adaptive efficient analysis for big data ergodic diffusion models, Weyl eigenvalue asymptotics and sharp adaptation on vector bundles, Optimal measurement allocation under precision budget constraint, Minimax estimation of continuous time deterministic signals in colored noise, Asymptotic equivalence of density estimation and Gaussian white noise, Optimal estimation in additive regression models, Nonlinear estimation over weak Besov spaces and minimax Bayes, Hybrid shrinkage estimators using penalty bases for the ordinal one-way layout, Complexity of linear ill-posed problems in Hilbert space, Nonparametric regression with the scale depending on auxiliary variable, On universal oracle inequalities related to high-dimensional linear models, Empirical risk minimization as parameter choice rule for general linear regularization methods, On the lower bound in second order estimation for Poisson processes: asymptotic efficiency, Asymptotically sufficient statistics in nonparametric regression experiments with correlated noise, Nearly optimal minimax estimator for high-dimensional sparse linear regression, Near-optimality of linear recovery in Gaussian observation scheme under \(\| \cdot \|_{2}^{2}\)-loss, Efficient robust nonparametric estimation in a semimartingale regression model, Minimax rates for statistical inverse problems under general source conditions, Robust estimation in inverse problems via quantile coupling, Sharp optimality for regression with real-time data, Estimator selection: a new method with applications to kernel density estimation, Adaptive Bayesian inference in the Gaussian sequence model using exponential-variance priors, Efficient shrinkage in parametric models, Sharp oracle inequalities for aggregation of affine estimators, Honest Bayesian confidence sets for the \(L^2\)-norm, Minimax nonparametric estimation on maxisets, From Gauss to Kolmogorov: localized measures of complexity for ellipses, General model selection estimation of a periodic regression with a Gaussian noise, Minimax testing of a composite null hypothesis defined via a quadratic functional in the model of regression, Empirical Bayes scaling of Gaussian priors in the white noise model, On asymptotically minimax nonparametric detection of signal in Gaussian white noise, Asymptotic equivalence of functional linear regression and a white noise inverse problem, Estimation and detection of functions from anisotropic Sobolev classes, Adaptive spectral regularizations of high dimensional linear models, Asymptotic minimax risk of predictive density estimation for non-parametric regression, Nonparametric signal detection with small type I and type II error probabilities, An asymptotically minimax kernel machine, Nonparametric estimation of the density of regression errors, Estimation and detection of high-variable functions from Sloan-Woźniakowski space, Adaptive Bayesian inference on the mean of an infinite-dimensional normal distribution,
Robust model selection for a semimartingale continuous time regression from discrete data, Ridgelets: estimating with ridge functions, Discretization effects in statistical inverse problems, Adaptive estimation of and oracle inequalities for probability densities and characteristic functions, Minimax detection of a signal in \(l_p\)-metrics, Density estimation in Besov spaces, Near-optimality of linear recovery from indirect observations, The minimax estimator of the pseudo-periodic function observed in the stationary noise, Estimation of the error density in a semiparametric transformation model, On estimation of time dependent spatial signal in Gaussian white noise., Periodic boxcar deconvolution and Diophantine approximation, Spectral cut-off regularizations for ill-posed linear models, Transition density estimation for stochastic differential equations via forward-reverse representations, The principle of penalized empirical risk in severely ill-posed problems, Nonparametric denoising of signals of unknown local structure. II: Nonparametric function recovery, Nonparametric estimation over shrinking neighborhoods: superefficiency and adaptation, Estimating a mean matrix: boosting efficiency by multiple affine shrinkage, Linear and convex aggregation of density estimators, On polyhedral estimation of signals via indirect observations, Minimax estimation via wavelet shrinkage, Modulation of estimators and confidence sets., Empirical Bayesian test of the smoothness, Asymptotically minimax tests for nonparametric hypotheses concerning the distribution density, On nonparametric regression for iid observations in a general setting, Confidence sets centered at \(C_p\)-estimators, Minimax quadratic estimation of a quadratic functional, On sharp nonparametric estimation of differentiable functions, Asymptotic equivalence and adaptive estimation for robust nonparametric regression, Asymptotic normality of posterior distributions for exponential families when the number of parameters tends to infinity., Minimax estimation in linear regression under restrictions, Sharp linear and block shrinkage wavelet estimation., Bayesian aspects of some nonparametric problems, Efficient estimation of a density in a problem of tomography., Adaptive prediction and estimation in linear regression with infinitely many parameters., The statistical work of Lucien Le Cam., Asymptotic equivalence theory for nonparametric regression with random design, Asymptotic equivalence of estimating a Poisson intensity and a positive diffusion drift, Recovering edges in ill-posed inverse problems: Optimality of curvelet frames., Oracle inequalities for inverse problems, Locally minimax efficiency of nonparametric density estimators for \(\chi^2\)-type losses, Pointwise and sup-norm sharp adaptive estimation of functions on the Sobolev classes, Relaxing the Gaussian assumption in shrinkage and SURE in high dimension, Optimal prediction for linear regression with infinitely many parameters., Estimation in ill-posed linear models with nuisance design, ASP fits to multi-way layouts, Second order asymptotical efficiency for a Poisson process, A general approach of least squares estimation and optimal filtering, Sharp adaptive estimation of the drift function for ergodic diffusions, Multivariate Bayesian function estimation, ADAPTIVE ESTIMATION IN A HETEROSCEDASTIC NONPARAMETRIC REGRESSION, Minimax and adaptive inference in nonparametric function estimation, From minimax shrinkage estimation to minimax shrinkage prediction,
Sequential nonparametric estimation of controlled multivariate regression, Sharp minimax distribution estimation for current status censoring with or without missing, Adaptive denoising of signals with local shift-invariant structure, Adaptive minimax optimality in statistical inverse problems via SOLIT—Sharp Optimal Lepskiĭ-Inspired Tuning, Nonparametric regression with responses missing at random and the scale depending on auxiliary covariates, High-dimensional Bernstein-von Mises theorem for the Diaconis-Ylvisaker prior, Nonparametric hypothesis testing with small type I or type II error probabilities, Oracle convergence rate of posterior under projection prior and Bayesian model selection, Stein shrinkage and second-order efficiency for semiparametric estimation of the shift, Discrepancy based model selection in statistical inverse problems, Aggregation of affine estimators, Model selection for density estimation with \(\mathbb L_2\)-loss, Estimation and detection of a function from tensor product spaces, Minimax risk over quadratically convex sets, Asymptotic approximation of nonparametric regression experiments with unknown variances, Semi-parametric second-order efficient estimation of the period of a signal, Conditional density estimation in a regression setting, Sharp adaptation for spherical inverse problems with applications to medical imaging, Penalized maximum likelihood and semiparametric second-order efficiency, Adaptive nonparametric confidence sets, Maximal spaces with given rate of convergence for thresholding algorithms, Adaptive minimax estimation of a fractional derivative, Random thresholds for linear model selection, Minimax and Bayes estimation in deconvolution problem, Adaptive asymptotically efficient estimation in heteroscedastic nonparametric regression, Ridge regression and asymptotic minimax estimation over spheres of growing dimension, Minimax theory of nonparametric hazard rate estimation: efficiency and adaptation, Improved robust model selection methods for a Lévy nonparametric regression in continuous time, Exact asymptotics for estimating the marginal density of discretely observed diffusion processes, Approximation dans les espaces métriques et théorie de l'estimation, A modified discrepancy principle to attain optimal convergence rates under unknown noise, The Risk of James–Stein and Lasso Shrinkage, Minimax nonparametric hypothesis testing for ellipsoids and Besov bodies, SDE Based Regression for Linear Random PDEs