Two-layer contractive encodings for learning stable nonlinear features
DOI: 10.1016/j.neunet.2014.09.008
zbMath: 1325.68196
OpenAlex: W2139336101
Wikidata: Q50630147 (Scholia: Q50630147)
MaRDI QID: Q890731
Sven Behnke, Tapani Raiko, Hannes Schulz, Kyung Hyun Cho
Publication date: 11 November 2015
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2014.09.008
Keywords: linear transformation; semi-supervised learning; deep learning; multi-layer perceptron; pretraining; two-layer contractive encoding
Related Items (1)
Uses Software
Cites Work
- Multilayer feedforward networks are universal approximators
- An Efficient Learning Procedure for Deep Boltzmann Machines
- On the Expressive Power of Deep Architectures
- Reducing the Dimensionality of Data with Neural Networks
- Learning Deep Architectures for AI
- Representational Power of Restricted Boltzmann Machines and Deep Belief Networks
- Enhanced Gradient for Training Restricted Boltzmann Machines