Hybrid random fields. A scalable approach to structure and parameter learning in probabilistic graphical models (Q535474)

From MaRDI portal
Property / review text
 
This book presents a novel class of probabilistic graphical models, namely hybrid random fields. These graphical models realize the tight binding between probability and graph theory by representing random variables as the nodes of a graph and statistical dependencies among the variables as the graph's edges. More broadly, however, the expression ``graphical model'' covers a number of learning machines, such as inductive logic programming, statistical relational learning, machine learning over structured domains, recursive neural networks, and graph neural networks. Before giving a more precise picture of this book, it may be useful to point out a number of the problems addressed in the text.

Chapter 2 introduces the idea of Bayesian networks in a qualitative manner. The formal definition then given involves the fundamental notions of the Markov condition, \(d\)-separation and the Markov blanket, and their relationship with the factorization of the overall joint distribution of the variables represented by the whole network. Additionally, the chapter gives a survey of generalizations of the basic model (such as dynamic Bayesian networks and hidden Markov models). Chapter 3 provides a step-by-step introduction to the major ideas behind the concept of a Markov random field, starting from stochastic processes. The role of the potential energy of local neighborhoods over an undirected graph is pointed out, along with the motivations behind the choice of modelling the overall joint density via the Gibbs distribution. Moreover, a theoretical justification for the definition of Markov random fields is provided, in the light of the equivalence between the assumption of the Gibbs distribution and the emergence of the Markov property.

In Chapter 4, the hybrid random field (HRF) model is introduced. An HRF represents the joint probability distribution over a set of random variables satisfying certain conditions involving the concepts of directed and undirected union. The chapter also discusses the difference between this definition of an HRF and a looser definition presented earlier in the literature. Chapter 5 generalizes the HRF model to domains involving continuous variables. Estimation techniques for modelling the conditional probability density functions associated with the nodes of the graphs are provided, with an emphasis on normal distributions and Gaussian mixture models. The maximum-likelihood estimation of the parameters of normal densities is studied in detail. It is interesting to note that the ``change of variables'' theorem allows for the development of a nonparanormal technique for estimating joint and conditional probability density functions without any prior assumption on the form of the modelled densities.

Chapter 6 shows that these learning machines learn effectively and prove useful in real-world scenarios. Some experiments based on the identification of Markov blankets are described, among them the discrimination between positive and negative individuals in an application regarding lung cancer diagnosis. These results bring evidence that hybrid random fields scale up well to large datasets, giving a level of accuracy at least as good as that achieved by traditional graphical models. Chapter 7 offers a philosophical reflection on some issues concerning the relationship between statistical machine learning and the cognitive sciences. Chapter 8 summarizes the main contributions of the book.

In summary, the authors have written a very valuable book -- rigorous in its treatment of the mathematical background, but also enriched with a very open view of the field, full of stimulating connections. / rank
 
Normal rank
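For orientation, the factorizations alluded to in the review above can be sketched in standard notation (a generic sketch, not the book's own formulation; here \(\mathrm{pa}(x_i)\) denotes the parents of a node in a Bayesian network, \(\mathrm{mb}(x_i)\) its Markov blanket, \(\psi_C\) a clique potential of a Markov random field, and \(Z\) the normalizing constant):
\[ P(x_1,\dots,x_n) \;=\; \prod_{i=1}^{n} P\bigl(x_i \mid \mathrm{pa}(x_i)\bigr) \qquad \text{(Bayesian network)} \]
\[ P(x_1,\dots,x_n) \;=\; \frac{1}{Z} \prod_{C} \psi_C(x_C) \qquad \text{(Markov random field, Gibbs form)} \]
\[ P\bigl(x_i \mid x_1,\dots,x_{i-1},x_{i+1},\dots,x_n\bigr) \;=\; P\bigl(x_i \mid \mathrm{mb}(x_i)\bigr) \qquad \text{(Markov blanket condition)} \]
Roughly speaking, it is the last, local characterization through Markov blankets that the hybrid random field model exploits to obtain scalable structure and parameter learning.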
Property / reviewed by
 
Property / reviewed by: Jerzy Martyna / rank
 
Normal rank
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 60-02 / rank
 
Normal rank
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 60G60 / rank
 
Normal rank
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 62M02 / rank
 
Normal rank
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 62F15 / rank
 
Normal rank
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 68R10 / rank
 
Normal rank
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 68T05 / rank
 
Normal rank
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 05C90 / rank
 
Normal rank
Property / zbMATH DE Number
 
Property / zbMATH DE Number: 5887505 / rank
 
Normal rank
Property / zbMATH Keywords
 
research exposition
Property / zbMATH Keywords: research exposition / rank
 
Normal rank
Property / zbMATH Keywords
 
probability theory
Property / zbMATH Keywords: probability theory / rank
 
Normal rank
Property / zbMATH Keywords
 
random fields
Property / zbMATH Keywords: random fields / rank
 
Normal rank
Property / zbMATH Keywords
 
Markov processes
Property / zbMATH Keywords: Markov processes / rank
 
Normal rank
Property / zbMATH Keywords
 
hypothesis testing
Property / zbMATH Keywords: hypothesis testing / rank
 
Normal rank
Property / zbMATH Keywords
 
Bayesian inference
Property / zbMATH Keywords: Bayesian inference / rank
 
Normal rank
Property / zbMATH Keywords
 
graph theory
Property / zbMATH Keywords: graph theory / rank
 
Normal rank
Property / zbMATH Keywords
 
learning and adaptive systems
Property / zbMATH Keywords: learning and adaptive systems / rank
 
Normal rank
Property / zbMATH Keywords
 
applications
Property / zbMATH Keywords: applications / rank
 
Normal rank
Property / MaRDI profile type
 
Property / MaRDI profile type: MaRDI publication profile / rank
 
Normal rank
Property / full work available at URL
 
Property / full work available at URL: https://doi.org/10.1007/978-3-642-20308-4 / rank
 
Normal rank
Property / OpenAlex ID
 
Property / OpenAlex ID: W2501347266 / rank
 
Normal rank
links / mardi / name
 


scientific article
Language: English
Label: Hybrid random fields. A scalable approach to structure and parameter learning in probabilistic graphical models
Description: scientific article

    Statements

    Hybrid random fields. A scalable approach to structure and parameter learning in probabilistic graphical models (English)
    12 May 2011
    This book presents a novel class of probabilistic graphical models, namely hybrid random fields. These graphical models realize the tight binding between probability and graph theory by representing random variables as the nodes of a graph and statistical dependencies among the variables as the graph's edges. More broadly, however, the expression ``graphical model'' covers a number of learning machines, such as inductive logic programming, statistical relational learning, machine learning over structured domains, recursive neural networks, and graph neural networks. Before giving a more precise picture of this book, it may be useful to point out a number of the problems addressed in the text.

    Chapter 2 introduces the idea of Bayesian networks in a qualitative manner. The formal definition then given involves the fundamental notions of the Markov condition, \(d\)-separation and the Markov blanket, and their relationship with the factorization of the overall joint distribution of the variables represented by the whole network. Additionally, the chapter gives a survey of generalizations of the basic model (such as dynamic Bayesian networks and hidden Markov models). Chapter 3 provides a step-by-step introduction to the major ideas behind the concept of a Markov random field, starting from stochastic processes. The role of the potential energy of local neighborhoods over an undirected graph is pointed out, along with the motivations behind the choice of modelling the overall joint density via the Gibbs distribution. Moreover, a theoretical justification for the definition of Markov random fields is provided, in the light of the equivalence between the assumption of the Gibbs distribution and the emergence of the Markov property.

    In Chapter 4, the hybrid random field (HRF) model is introduced. An HRF represents the joint probability distribution over a set of random variables satisfying certain conditions involving the concepts of directed and undirected union. The chapter also discusses the difference between this definition of an HRF and a looser definition presented earlier in the literature. Chapter 5 generalizes the HRF model to domains involving continuous variables. Estimation techniques for modelling the conditional probability density functions associated with the nodes of the graphs are provided, with an emphasis on normal distributions and Gaussian mixture models. The maximum-likelihood estimation of the parameters of normal densities is studied in detail. It is interesting to note that the ``change of variables'' theorem allows for the development of a nonparanormal technique for estimating joint and conditional probability density functions without any prior assumption on the form of the modelled densities.

    Chapter 6 shows that these learning machines learn effectively and prove useful in real-world scenarios. Some experiments based on the identification of Markov blankets are described, among them the discrimination between positive and negative individuals in an application regarding lung cancer diagnosis. These results bring evidence that hybrid random fields scale up well to large datasets, giving a level of accuracy at least as good as that achieved by traditional graphical models. Chapter 7 offers a philosophical reflection on some issues concerning the relationship between statistical machine learning and the cognitive sciences. Chapter 8 summarizes the main contributions of the book.

    In summary, the authors have written a very valuable book -- rigorous in its treatment of the mathematical background, but also enriched with a very open view of the field, full of stimulating connections.
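    To give a concrete flavour of the maximum-likelihood estimation of normal densities and Gaussian mixture models discussed in Chapter 5, the following minimal Python sketch may help (illustrative only, not the authors' code; it assumes NumPy is available, and all function names are ours):

    import numpy as np

    def normal_mle(samples):
        # Maximum-likelihood estimates of mean and variance for a univariate normal.
        mu = samples.mean()
        var = ((samples - mu) ** 2).mean()  # ML estimator divides by n, not n - 1
        return mu, var

    def normal_pdf(x, mu, var):
        # Density of N(mu, var) evaluated at x.
        return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

    def mixture_pdf(x, weights, means, variances):
        # Density of a Gaussian mixture with the given weights and components.
        return sum(w * normal_pdf(x, m, v) for w, m, v in zip(weights, means, variances))

    rng = np.random.default_rng(0)
    data = rng.normal(loc=2.0, scale=1.5, size=1000)
    mu_hat, var_hat = normal_mle(data)
    print(mu_hat, var_hat)                                    # close to 2.0 and 2.25
    print(mixture_pdf(0.0, [0.3, 0.7], [0.0, 3.0], [1.0, 2.0]))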
    research exposition
    probability theory
    random fields
    Markov processes
    hypothesis testing
    Bayesian inference
    graph theory
    learning and adaptive systems
    applications

    Identifiers
