High-dimensional Ising model selection using \(\ell_{1}\)-regularized logistic regression
From MaRDI portal
Publication:973867
Abstract: We consider the problem of estimating the graph associated with a binary Ising Markov random field. We describe a method based on \(\ell_1\)-regularized logistic regression, in which the neighborhood of any given node is estimated by performing logistic regression subject to an \(\ell_1\)-constraint. The method is analyzed under high-dimensional scaling in which both the number of nodes \(p\) and the maximum neighborhood size \(d\) are allowed to grow as a function of the number of observations \(n\). Our main results provide sufficient conditions on the triple \((n, p, d)\) and the model parameters for the method to succeed in consistently estimating the neighborhood of every node in the graph simultaneously. With coherence conditions imposed on the population Fisher information matrix, we prove that consistent neighborhood selection can be obtained for sample sizes \(n = \Omega(d^3 \log p)\) with exponentially decaying error. When these same conditions are imposed directly on the sample matrices, we show that a reduced sample size of \(n = \Omega(d^2 \log p)\) suffices for the method to estimate neighborhoods consistently. Although this paper focuses on binary graphical models, we indicate how a generalization of the method would apply to general discrete Markov random fields.
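The node-wise procedure the abstract describes can be sketched in a few lines: for each node \(s\), fit an \(\ell_1\)-penalized logistic regression of \(X_s\) on the remaining variables and read the estimated neighborhood off the nonzero coefficients. The sketch below is a hypothetical illustration, not the paper's exact estimator: scikit-learn's solver stands in for the method, and the chain graph, coupling strength, regularization level, and threshold are all arbitrary choices made for the demo.

```python
import numpy as np
from itertools import product
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
p, n, theta = 4, 4000, 0.8
# Illustrative true graph: a chain 0-1-2-3 with coupling theta on each edge.
edges = [(0, 1), (1, 2), (2, 3)]

# Sample exactly from the Ising distribution by enumerating all 2^p states
# (feasible only because p is tiny in this toy example).
states = np.array(list(product([-1, 1], repeat=p)), dtype=float)
energy = np.array([sum(theta * x[i] * x[j] for i, j in edges) for x in states])
probs = np.exp(energy)
probs /= probs.sum()
X = states[rng.choice(len(states), size=n, p=probs)]

# Node-wise l1-regularized logistic regression: an edge (s, t) is kept if
# either node's regression assigns the other a coefficient above a threshold.
est_edges = set()
for s in range(p):
    others = [t for t in range(p) if t != s]
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=1.0)
    clf.fit(X[:, others], X[:, s])
    for t, coef in zip(others, clf.coef_[0]):
        if abs(coef) > 0.1:  # shrink away spurious small coefficients
            est_edges.add(tuple(sorted((s, t))))

print(sorted(est_edges))  # should ideally recover the chain's three edges
```

With a weak penalty and ample samples, each neighbor's coefficient concentrates near \(2\theta\) while non-neighbors are shrunk toward zero, which is the intuition behind the paper's consistency guarantees.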
Recommendations
- High-dimensional Ising model selection with Bayesian information criteria
- Sparse estimation in Ising model via penalized Monte Carlo methods
- High-dimensional structure estimation in Ising models: local separation criterion
- High-dimensional graphs and variable selection with the Lasso
- Model selection through sparse maximum likelihood estimation for multivariate Gaussian or binary data
Cites work
- scientific article; zbMATH DE number 437298
- scientific article; zbMATH DE number 5957408
- scientific article; zbMATH DE number 3934272
- scientific article; zbMATH DE number 1206370
- scientific article; zbMATH DE number 1408945
- An interior-point method for large-scale \(l_1\)-regularized logistic regression
- Approximating discrete probability distributions with dependence trees
- Blockwise sparse regression
- Causation, prediction, and search
- Consistent estimation of the basic neighborhood of Markov random fields
- Convex Analysis
- Estimating high-dimensional directed acyclic graphs with the PC-algorithm
- Graphical models, exponential families, and variational inference
- High-dimensional graphs and variable selection with the Lasso
- Just relax: convex programming methods for identifying sparse signals in noise
- Learning factor graphs in polynomial time and sample complexity
- Local operator theory, random matrices and Banach spaces.
- Matrix Analysis
- Maximum likelihood bounded tree-width Markov networks
- Model Selection and Estimation in Regression with Grouped Variables
- Model selection through sparse maximum likelihood estimation for multivariate Gaussian or binary data
- Optimally sparse representation in general (nonorthogonal) dictionaries via \(\ell_1\) minimization
- Probability Inequalities for Sums of Bounded Random Variables
- Reconstruction of Markov random fields from samples: some observations and algorithms
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Sparse permutation invariant covariance estimation
- Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images
- Support union recovery in high-dimensional multivariate regression
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- The Group Lasso for Logistic Regression
Cited in
- Ising models for neural activity inferred via selective cluster expansion: structural and coding properties
- Lower bounds for testing graphical models: colorings and antiferromagnetic Ising models
- Multiclass analysis and prediction with network structured covariates
- Objective Bayesian edge screening and structure selection for Ising networks
- Inferring network structure in non-normal and mixed discrete-continuous genomic data
- The Dantzig selector for a linear model of diffusion processes
- Exact recovery in the Ising blockmodel
- scientific article; zbMATH DE number 7370619
- Hinge-loss Markov random fields and probabilistic soft logic
- Analysis of noisy survival data with graphical proportional hazards measurement error models
- Pairwise sparse + low-rank models for variables of mixed type
- Learning a tree-structured Ising model in order to make predictions
- An oracle approach for interaction neighborhood estimation in random fields
- Inference of large modified Poisson-type graphical models: application to RNA-seq data in childhood atopic asthma studies
- Feature selection for data integration with mixed multiview data
- Joint estimation of parameters in Ising model
- Sparse Poisson regression with penalized weighted score function
- Generalized stochastic Frank-Wolfe algorithm with stochastic ``substitute gradient'' for structured convex optimization
- Learning loosely connected Markov random fields
- Estimating networks with jumps
- Efficiently learning Ising models on arbitrary graphs (extended abstract)
- AMP chain graphs: minimal separators and structure learning algorithms
- scientific article; zbMATH DE number 7306910
- Graphical Models and Message-Passing Algorithms: Some Introductory Lectures
- Estimating the interaction graph of stochastic neural dynamics
- Property testing in high-dimensional Ising models
- A decomposition-based algorithm for learning the structure of multivariate regression chain graphs
- De-noising analysis of noisy data under mixed graphical models
- Joint estimation of heterogeneous exponential Markov random fields through an approximate likelihood inference
- Provable training set debugging for linear regression
- Robust measurement via a fused latent and graphical item response theory model
- Structure learning of contextual Markov networks using marginal pseudo-likelihood
- Bayesian model selection for high-dimensional Ising models, with applications to educational data
- Identifying interacting pairs of sites in Ising models on a countable set
- Statistical analysis of sparse approximate factor models
- Discussion to: ``Bayesian graphical models for modern biological applications'' by Y. Ni, V. Baladandayuthapani, M. Vannucci and F. C. Stingo
- Sharp oracle inequalities and slope heuristic for specification probabilities estimation in discrete random fields
- scientific article; zbMATH DE number 7415078
- Variable screening in multivariate linear regression with high-dimensional covariates
- Customer choice models vs. machine learning: finding optimal product displays on Alibaba
- High-dimensional structure learning of binary pairwise Markov networks: a comparative numerical study
- Inference under Fine-Gray competing risks model with high-dimensional covariates
- Unified analysis of stochastic gradient methods for composite convex and smooth optimization
- Parameter inference in a probabilistic model from data: regulation of transition rate in the Monte Carlo method
- A global approach for learning sparse Ising models
- Asymptotic theory of \(\ell_1\)-regularized PDE identification from a single noisy trajectory
- Sparse equisigned PCA: algorithms and performance bounds in the noisy rank-1 setting
- Bayesian Shrinkage for Functional Network Models, With Applications to Longitudinal Item Response Data
- Covariance structure approximation via gLasso in high-dimensional supervised classification
- A unified framework for structured graph learning via spectral constraints
- Composite mixture of log-linear models with application to psychiatric studies
- Robust Variable and Interaction Selection for Logistic Regression and General Index Models
- Parameter inference in a probabilistic model using clustered data
- Estimation of high-dimensional partially-observed discrete Markov random fields
- Comment on ``Hypothesis testing by convex optimization''
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Sparse nonparametric graphical models
- Tight conditions for consistency of variable selection in the context of high dimensionality
- Honest variable selection in linear and logistic regression models via \(\ell _{1}\) and \(\ell _{1}+\ell _{2}\) penalization
- Probabilistic graphical models and Markov networks
- Adaptive cluster expansion for the inverse Ising problem: convergence, algorithm and tests
- Computational implications of reducing data to sufficient statistics
- Combinatorial approach to exactly solve the 1D Ising model
- SLOPE is adaptive to unknown sparsity and asymptotically minimax
- Learning quadratic variance function (QVF) DAG models via overdispersion scoring (ODS)
- Latent binary MRF for online reconstruction of large scale systems
- Dimension reduction and variable selection in case control studies via regularized likelihood optimization
- Graphical models for zero-inflated single cell gene expression
- Sparse directed acyclic graphs incorporating the covariates
- Universality of the mean-field for the Potts model
- High-dimensional Ising model selection with Bayesian information criteria
- Generalized score matching for non-negative data
- Local conditional and marginal approach to parameter estimation in discrete graphical models
- Estimation of high-dimensional graphical models using regularized score matching
- Statistical mechanics of the inverse Ising problem and the optimal objective function
- Bayesian structure learning in sparse Gaussian graphical models
- High-dimensional Poisson structural equation model learning via \(\ell_1\)-regularized regression
- Exponential-family models of random graphs: inference in finite, super and infinite population scenarios
- ROCKET: robust confidence intervals via Kendall's tau for transelliptical graphical models
- Sparse linear models and \(l_1\)-regularized 2SLS with high-dimensional endogenous regressors and instruments
- Learning loopy graphical models with latent variables: efficient methods and guarantees
- Concentration and consistency results for canonical and curved exponential-family models of random graphs
- Simple method for inference in inverse Ising problem using full data
- A general algorithm for covariance modeling of discrete data
- Estimating heterogeneous graphical models for discrete data with an application to roll call voting
- Poisson dependency networks: gradient boosted models for multivariate count data
- Multivariate Bernoulli distribution
- Loglinear model selection and human mobility
- Sparse covariance estimation in heterogeneous samples
- Fluctuations in mean-field Ising models
- Bayesian graphical models for differential pathways
- Estimating time-varying networks
- Region selection in Markov random fields: Gaussian case
- Structure learning in inverse Ising problems using \(\ell_2\)-regularized linear estimator
- Estimating finite mixtures of ordinal graphical models
- Tuning parameter calibration for \(\ell_1\)-regularized logistic regression
- High-dimensional structure estimation in Ising models: local separation criterion
- Graphical-model based high dimensional generalized linear models
- Network-based discriminant analysis for multiclassification
- Sparse and low-rank matrix regularization for learning time-varying Markov networks