Deep learning Gauss-Manin connections
From MaRDI portal
Publication:2113261
Abstract: The Gauss-Manin connection of a family of hypersurfaces governs the change of the period matrix along the family. This connection can be complicated even when the equations defining the family look simple. When this is the case, it is computationally expensive to compute the period matrices of varieties in the family via homotopy continuation. We train neural networks that can quickly and reliably guess the complexity of the Gauss-Manin connection of a pencil of hypersurfaces. As an application, we compute the periods of 96% of smooth quartic surfaces in projective 3-space whose defining equation is a sum of five monomials; from the periods of these quartic surfaces, we extract their Picard numbers and the endomorphism fields of their transcendental lattices.
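The classifier described in the abstract — a network that inspects a pencil of hypersurfaces and predicts how expensive its Gauss-Manin connection will be to integrate — might be sketched as follows. The feature encoding (flattened exponent vectors of the defining monomials) and the one-hidden-layer architecture are illustrative assumptions for this record, not the paper's actual model, and the weights here are random rather than trained.

```python
import math
import random

def featurize(monomials):
    """Flatten the exponent vectors of the hypersurface's monomial
    support into a fixed-length feature vector (hypothetical encoding)."""
    return [float(e) for mono in monomials for e in mono]

def mlp_forward(x, W1, b1, W2, b2):
    """One tanh hidden layer, sigmoid output: a stand-in for a
    'hard pencil' probability classifier (architecture is assumed)."""
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    z = sum(w * hi for w, hi in zip(W2, h)) + b2
    return 1.0 / (1.0 + math.exp(-z))  # value in (0, 1)

random.seed(0)
# Exponent vectors of the 5 monomials of x^4 + y^4 + z^4 + w^4 + xyzw,
# a smooth quartic surface in projective 3-space as in the abstract.
quartic = [(4, 0, 0, 0), (0, 4, 0, 0), (0, 0, 4, 0), (0, 0, 0, 4), (1, 1, 1, 1)]
x = featurize(quartic)

n_in, n_hidden = len(x), 8
W1 = [[random.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_hidden)]
b1 = [0.0] * n_hidden
W2 = [random.uniform(-0.5, 0.5) for _ in range(n_hidden)]
b2 = 0.0

p = mlp_forward(x, W1, b1, W2, b2)
print(round(p, 3))
```

In practice such a predicted score would be used to triage pencils: cheap ones are sent to the period computation via homotopy continuation, expensive ones are skipped or deferred.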
Recommendations
- Deep Neural Networks in a Mathematical Framework
- Mathematical methods in deep learning
- Mathematical Aspects of Deep Learning
- Estimation of a regression function on a manifold by fully connected deep neural networks
- Deep learning of conjugate mappings
- The Modern Mathematics of Deep Learning
- Differential Geometry Methods for Constructing Manifold-Targeted Recurrent Neural Networks
- Topological approaches to deep learning
- A singular Riemannian geometry approach to deep neural networks. I: Theoretical foundations
- Graph Laplacians, Riemannian manifolds, and their machine-learning
Cites work
- scientific article; zbMATH DE number 3749170
- scientific article; zbMATH DE number 783783
- scientific article; zbMATH DE number 5221670
- A numerical transcendental method in algebraic geometry: computation of Picard groups and related invariants
- Algebraic geometry over the complex numbers
- Approximation by superpositions of a sigmoidal function
- Computing periods of hypersurfaces
- DGM: a deep learning algorithm for solving partial differential equations
- Deep learning
- Examples of \(K3\) surfaces with real multiplication
- Factoring polynomials with rational coefficients
- Hodge groups of K3 surfaces
- Hodge theory and complex algebraic geometry. II. Transl. from the French by Leila Schneps
- Multilayer feedforward networks are universal approximators
- Networks and the best approximation property
- On reconstructing subvarieties from their periods
- On the differentiation of De Rham cohomology classes with respect to parameters
- On the periods of certain rational integrals. I, II
- Shortest paths algorithms: Theory and experimental evaluation
- The Magma algebra system. I: The user language
- Using machine learning to improve cylindrical algebraic decomposition
- Zeta functions of nondegenerate hypersurfaces in toric varieties via controlled reduction in \(p\)-adic cohomology
Cited in (9)
- Cluster algebras: network science and machine learning
- Advancing mathematics by guiding human intuition with AI
- Machine learning line bundle connections
- Topological approaches to deep learning
- From the string landscape to the mathematical landscape: a machine-learning outlook
- Deep learning the hyperbolic volume of a knot
- Is deep learning a useful tool for the pure mathematician?
- Ehresmann connections and feedforward neural networks
- Differential geometry and stochastic dynamics with deep learning numerics