On the expressive power of deep learning: A tensor analysis

Research output: Contribution to journal › Conference article › peer-review

Abstract

It has long been conjectured that hypothesis spaces suitable for data that is compositional in nature, such as text or images, may be more efficiently represented with deep hierarchical networks than with shallow ones. Despite the vast empirical evidence supporting this belief, theoretical justifications to date are limited. In particular, they do not account for the locality, sharing and pooling constructs of convolutional networks, the most successful deep learning architecture to date. In this work we derive a deep network architecture based on arithmetic circuits that inherently employs locality, sharing and pooling. An equivalence between the networks and hierarchical tensor factorizations is established. We show that a shallow network corresponds to CP (rank-1) decomposition, whereas a deep network corresponds to Hierarchical Tucker decomposition. Using tools from measure theory and matrix algebra, we prove that, apart from a negligible set, all functions that can be implemented by a deep network of polynomial size require exponential size to be realized (or even approximated) by a shallow network. Since log-space computation transforms our networks into SimNets, the result applies directly to a deep learning architecture demonstrating promising empirical performance. The construction and theory developed in this paper shed new light on various practices and ideas employed by the deep learning community.
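
For readers skimming the abstract, the shallow/deep correspondence can be summarized with a brief sketch following the paper's construction (the symbols A^y, Z, the rank parameters r_l, and the intermediate tensors φ^{l,j,γ} are introduced here for illustration and are not quoted from this record):

Shallow network ↔ CP decomposition of the order-N coefficient tensor:
  \mathcal{A}^y = \sum_{z=1}^{Z} a^y_z \, \mathbf{a}^{z,1} \otimes \mathbf{a}^{z,2} \otimes \cdots \otimes \mathbf{a}^{z,N}

Deep network ↔ Hierarchical Tucker decomposition, built level by level:
  \phi^{1,j,\gamma} = \sum_{\alpha=1}^{r_0} a^{1,j,\gamma}_\alpha \, \mathbf{a}^{0,2j-1,\alpha} \otimes \mathbf{a}^{0,2j,\alpha}
  \phi^{l,j,\gamma} = \sum_{\alpha=1}^{r_{l-1}} a^{l,j,\gamma}_\alpha \, \phi^{l-1,2j-1,\alpha} \otimes \phi^{l-1,2j,\alpha}
  \mathcal{A}^y = \sum_{\alpha=1}^{r_{L-1}} a^{L,y}_\alpha \, \phi^{L-1,1,\alpha} \otimes \phi^{L-1,2,\alpha}

In these terms, the depth-efficiency result described in the abstract says that, outside a set of measure zero, expressing (or even approximating) a tensor of the deep hierarchical form with the shallow CP form forces the CP rank Z to grow exponentially in the order N.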

Original language: English
Pages (from-to): 698-728
Number of pages: 31
Journal: Journal of Machine Learning Research
Volume: 49
Issue number: June
State: Published - 6 Jun 2016
Event: 29th Conference on Learning Theory, COLT 2016 - New York, United States
Duration: 23 Jun 2016 - 26 Jun 2016

Keywords

  • Arithmetic Circuits
  • Deep Learning
  • Expressive Power
  • Tensor Decompositions

All Science Journal Classification (ASJC) codes

  • Software
  • Artificial Intelligence
  • Control and Systems Engineering
  • Statistics and Probability
