Learning quantitative sequence-function relationships from massively parallel experiments
From MaRDI portal
Abstract: A fundamental aspect of biological information processing is the ubiquity of sequence-function relationships -- functions that map the sequence of DNA, RNA, or protein to a biochemically relevant activity. Most sequence-function relationships in biology are quantitative, but only recently have experimental techniques for effectively measuring these relationships been developed. The advent of such "massively parallel" experiments presents an exciting opportunity for the concepts and methods of statistical physics to inform the study of biological systems. After reviewing these recent experimental advances, we focus on the problem of how to infer parametric models of sequence-function relationships from the data produced by these experiments. Specifically, we retrace and extend recent theoretical work showing that inference based on mutual information, not the standard likelihood-based approach, is often necessary for accurately learning the parameters of these models. Closely connected with this result is the emergence of "diffeomorphic modes" -- directions in parameter space that are far less constrained by data than likelihood-based inference would suggest. Analogous to Goldstone modes in physics, diffeomorphic modes arise from an arbitrarily broken symmetry of the inference problem. An analytically tractable model of a massively parallel experiment is then described, providing an explicit demonstration of these fundamental aspects of statistical inference. This paper concludes with an outlook on the theoretical and computational challenges currently facing studies of quantitative sequence-function relationships.
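The abstract's central claim, that mutual-information-based inference can pick out the correct parameter direction where data are noisy, can be illustrated with a toy sketch. The simulated data, the additive sequence model, and the plug-in histogram estimator below are all illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def mutual_information(x, y, bins=12):
    """Plug-in MI estimate (in nats) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# Toy "massively parallel" experiment: N binary sequences of length L,
# an additive (linear) sequence-function model, and noisy measurements.
L, N = 10, 20_000
seqs = rng.integers(0, 2, size=(N, L)).astype(float)
w_true = rng.normal(size=L)                      # true model parameters
y = seqs @ w_true + rng.normal(scale=0.5, size=N)  # noisy readout

# The projection along the true parameter direction shares far more
# information with the measurements than a random direction does.
mi_true = mutual_information(seqs @ w_true, y)
mi_rand = mutual_information(seqs @ rng.normal(size=L), y)
```

Maximizing `mi_true`-style estimates over candidate parameters is the inference strategy the abstract contrasts with likelihood maximization; note that any reparameterization of the projection that leaves the ranking of sequences intact leaves the MI unchanged, which is the symmetry behind the "diffeomorphic modes" discussed above.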
Recommendations
- Parametric inference in the large data limit using maximally informative models
- Statistical mechanics of transcription-factor binding site discovery using hidden Markov models
- The jigsaw puzzle of sequence phenotype inference: piecing together Shannon entropy, importance sampling, and empirical Bayes
- Learning a nonlinear dynamical system model of gene regulation: A perturbed steady-state approach
- Learning and inference in computational systems biology
Cites work
- Scientific article; zbMATH DE number 107482 (no title available)
- Analyzing Neural Responses to Natural Signals: Maximally Informative Dimensions
- Equitability, mutual information, and the maximal information coefficient
- Field Theories for Learning Probability Distributions
- Learning quadratic receptive fields from neural responses to natural stimuli
- Parametric inference in the large data limit using maximally informative models
Cited in 2 documents
MaRDI item: Q290475