Significantly, almost all of the entrants used a variant of an approach known as a convolutional neural network (ConvNet), an architecture first refined in 1998 by Yann LeCun, an NYU professor who was recently hired to head the Facebook AI Research Center.
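At the heart of a ConvNet layer is a small learned filter slid across the image, producing a feature map. A minimal NumPy sketch of that operation (a toy illustration, not code from any of the libraries listed below; the filter here is a hand-picked horizontal-gradient kernel, whereas a real ConvNet learns its filters):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation), the core ConvNet operation."""
    kh, kw = kernel.shape
    H, W = image.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Each output value is the filter applied to one image patch
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy 5x5 "image" and a 3x3 horizontal-gradient filter
image = np.arange(25, dtype=float).reshape(5, 5)
kernel = np.array([[-1., 0., 1.]] * 3)
feature_map = conv2d(image, kernel)
print(feature_map.shape)  # (3, 3)
```

A real ConvNet stacks many such filters, interleaved with nonlinearities and pooling, and trains the filter weights by backpropagation.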
Here are additional resources to learn more about Deep Learning and convolutional neural networks:
- My post Where to Learn Deep Learning - Courses, Tutorials, Software
- KDnuggets Exclusive Interview with Yann LeCun, Part 1 and Part 2
- Book draft on Deep Learning, by Yoshua Bengio, Ian Goodfellow, and Aaron Courville
Software:
- cuda-convnet, a fast C++ implementation of convolutional (or more generally, feed-forward) neural networks
- Matlab/Octave toolbox for deep learning.
- pylearn2 on GitHub, a Python machine learning library that includes maxout code for Deep Learning
Yann LeCun's talks, including:
- Learning Feature Hierarchies for Vision
- Building Artificial Vision Systems with Machine Learning
- 5 lectures at PCMI Summer school: Introduction, Energy-Based Learning, Multi-Stage Learning, Convolutional Networks, Unsupervised Deep Learning and more
Fundamental papers on Deep Learning, from a Quora answer by Yoshua Bengio.
Here are selected papers:
- A fast learning algorithm for deep belief nets, by Geoff Hinton et al
- Training a deep autoencoder or a classifier on MNIST digits, code by Ruslan Salakhutdinov and Geoff Hinton
- Deep Boltzmann Machines, by Ruslan Salakhutdinov and Geoff Hinton
- Learning Deep Boltzmann Machines, code by Ruslan Salakhutdinov
- Maxout Networks, by Ian Goodfellow et al
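The maxout unit from the Maxout Networks paper replaces a fixed nonlinearity with the maximum over several learned affine pieces. A hedged NumPy sketch of a single maxout layer (shapes and variable names are my own illustration, not pylearn2's API):

```python
import numpy as np

def maxout(x, W, b):
    """Maxout unit (Goodfellow et al., 2013): max over k affine pieces.

    x: input vector of shape (d_in,)
    W: weights of shape (d_in, d_out, k)
    b: biases of shape (d_out, k)
    """
    # Compute k affine pre-activations per output unit: shape (d_out, k)
    z = np.einsum('i,ijk->jk', x, W) + b
    # The activation is the max over the k pieces for each unit
    return z.max(axis=1)

rng = np.random.default_rng(0)
x = rng.standard_normal(4)
W = rng.standard_normal((4, 3, 2))  # 3 output units, k=2 pieces each
b = rng.standard_normal((3, 2))
h = maxout(x, W, b)
print(h.shape)  # (3,)
```

Because the max of affine functions is convex and piecewise linear, a maxout unit can approximate ReLU, absolute value, and other activations as special cases, which is part of its appeal in the paper.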
An excellent Introduction to Deep Learning: From Perceptrons to Deep Networks, by Ivan Vasilev at Toptal.
Another excellent presentation: Introduction to Parallel Iterative Deep Learning on Hadoop Next-Generation YARN Framework, by Adam Gibson and Josh Patterson.