Estimation of high-dimensional partially-observed discrete Markov random fields
From MaRDI portal
Publication:470504
DOI: 10.1214/14-EJS946 · zbMATH Open: 1302.62206 · arXiv: 1108.2835 · Wikidata: Q105584274 · Scholia: Q105584274 · MaRDI QID: Q470504 · FDO: Q470504
Authors: Yves F. Atchadé
Publication date: 12 November 2014
Published in: Electronic Journal of Statistics
Abstract: We consider the estimation of high-dimensional network structures from partially observed Markov random field data using a penalized pseudo-likelihood approach. We fit a misspecified model obtained by ignoring the missing data problem. We study the consistency of the estimator and derive a bound on its rate of convergence. The results obtained relate the rate of convergence of the estimator to the extent of the missing data problem. We report some simulation results that empirically validate some of the theoretical findings.
Full work available at URL: https://arxiv.org/abs/1108.2835
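As a rough illustration of the approach described in the abstract — not the paper's exact estimator — the sketch below fits node-wise \(\ell_1\)-penalized logistic regressions, a standard pseudo-likelihood method for Ising models, after simply discarding incomplete observations. Ignoring the missing-data mechanism in this way deliberately fits a misspecified model, in the spirit of the paper. All variable names, the toy coupling matrix, and the tuning parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy Ising model (hypothetical couplings, 5 nodes, 2 true edges) ---
p = 5
J = np.zeros((p, p))
J[0, 1] = J[1, 0] = 0.8
J[2, 3] = J[3, 2] = 0.8

def gibbs_sample(J, n, burn=200):
    """Draw n samples x in {-1,+1}^p from the Ising model with couplings J."""
    p = J.shape[0]
    x = rng.choice([-1, 1], size=p)
    out = np.empty((n, p))
    for t in range(burn + n):
        for j in range(p):
            h = J[j] @ x - J[j, j] * x[j]          # local field at node j
            prob = 1.0 / (1.0 + np.exp(-2.0 * h))  # P(x_j = +1 | rest)
            x[j] = 1 if rng.random() < prob else -1
        if t >= burn:
            out[t - burn] = x
    return out

X = gibbs_sample(J, n=400)

# --- Introduce missingness, then ignore it (complete-case analysis) ---
mask = rng.random(X.shape) < 0.1            # ~10% entries missing
Xobs = np.where(mask, np.nan, X)
Xc = Xobs[~np.isnan(Xobs).any(axis=1)]      # drop incomplete rows

# --- l1-penalized node-wise logistic regression via proximal gradient ---
def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def neighborhood(Xc, j, lam=0.05, step=0.01, iters=2000):
    """Estimate row j of the coupling matrix by penalized pseudo-likelihood."""
    y = Xc[:, j]
    Z = np.delete(Xc, j, axis=1)
    beta = np.zeros(Z.shape[1])
    for _ in range(iters):
        # gradient of the average logistic loss log(1 + exp(-2 y z'beta))
        margin = 2.0 * y * (Z @ beta)
        grad = -(Z * (2.0 * y / (1.0 + np.exp(margin)))[:, None]).mean(axis=0)
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta

Theta = np.zeros((p, p))
for j in range(p):
    Theta[j, np.arange(p) != j] = neighborhood(Xc, j)
```

Each node's conditional distribution is fit separately, so the resulting `Theta` need not be symmetric; in practice the two estimates per edge are combined (e.g. by averaging or an AND/OR rule). The paper's theory bounds how the convergence rate of such an estimator degrades with the extent of missingness.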
Recommendations
- Parameter estimation for Gibbs distributions from partially observed data
- High-dimensional covariance matrix estimation with missing observations
- Sparse estimation in Ising model via penalized Monte Carlo methods
- Reconstruction of Markov Random Fields from Samples: Some Observations and Algorithms
Keywords: high-dimensional inference; pseudo-likelihood; misspecification; Markov random fields; network estimation; penalized likelihood inference
Cites Work
- Lasso-type recovery of sparse representations for high-dimensional data
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional graphs and variable selection with the Lasso
- Title not available
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Sparsistency and rates of convergence in large covariance matrix estimation
- Sparse permutation invariant covariance estimation
- Regularized estimation of large covariance matrices
- Estimation of sparse binary pairwise Markov networks using pseudo-likelihoods
- Model selection through sparse maximum likelihood estimation for multivariate Gaussian or binary data
- Model selection and estimation in the Gaussian graphical model
- High-dimensional Ising model selection using \(\ell _{1}\)-regularized logistic regression
- First-Order Methods for Sparse Covariance Selection
- Rejoinder: Latent variable graphical model selection via convex optimization
- Model selection for Gaussian concentration graphs
- Pairwise Variable Selection for High-Dimensional Model-Based Clustering
- Gibbs measures and phase transitions
- Self-concordant analysis for logistic regression
- Nonconcave penalized composite conditional likelihood estimation of sparse Ising models
Cited In (4)
- Log-determinant relaxation for approximate inference in discrete Markov random fields
- Markov Neighborhood Regression for High-Dimensional Inference
- Structure recovery for partially observed discrete Markov random fields on graphs under not necessarily positive distributions
- Model selection for Markov random fields on graphs under a mixing condition