Mathematical Research Data Initiative

Principles for initialization and architecture selection in graph neural networks with ReLU activations

From MaRDI portal
Publication:6664474

DOI: 10.1137/23M1600621 · MaRDI QID: Q6664474 · FDO: Q6664474


Authors: Gage DeZoort, Boris Hanin


Publication date: 16 January 2025

Published in: SIAM Journal on Mathematics of Data Science





Recommendations

  • Theory of graph neural networks: representation and learning
  • The logic of graph neural networks
  • Activation function design for deep networks: linearity and effective initialisation
  • \(k\)-hop graph neural networks


zbMATH Keywords

initialization; oversmoothing; graph neural networks


Mathematics Subject Classification ID

  • Artificial neural networks and deep learning (68T07)
  • Artificial intelligence (68T99)


Cites Work

  • Sharp nonasymptotic bounds on the norm of random matrices with independent entries
  • Limit of the smallest eigenvalue of a large dimensional sample covariance matrix
  • Products of many large random matrices and gradients in deep neural networks
  • The Principles of Deep Learning Theory








Retrieved from "https://portal.mardi4nfdi.de/w/index.php?title=Publication:6664474&oldid=40243987"
This page was last edited on 13 February 2025, at 20:24.