An ETF view of Dropout regularization

Dor Bank, Raja Giryes

Research output: Contribution to conference › Paper › peer-review


Dropout is a popular regularization technique in deep learning. Yet, the reason for its success is still not fully understood. This paper provides a new interpretation of Dropout from a frame theory perspective. By drawing a connection to recent developments in analog channel coding, we suggest that for a certain family of autoencoders with a linear encoder, optimizing the encoder with dropout regularization leads to an equiangular tight frame (ETF). Since this optimization is non-convex, we add another regularization that promotes such structures by minimizing the cross-correlation between filters in the network. We demonstrate its applicability to convolutional and fully connected layers in both feed-forward and recurrent networks. All these results suggest that there is indeed a relationship between dropout and the ETF structure of the regularized linear operations.
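The cross-correlation regularizer described in the abstract can be illustrated with a minimal NumPy sketch. This is an illustrative reconstruction, not the authors' implementation: it normalizes each filter, penalizes the squared off-diagonal entries of the Gram matrix, and compares the resulting maximal cross-correlation against the Welch bound, which an ETF attains with equality.

```python
import numpy as np

def etf_regularizer(W):
    """Off-diagonal cross-correlation penalty on the rows (filters) of W.

    Penalizing the squared off-diagonal entries of the Gram matrix of
    the normalized filters pushes them toward an ETF-like structure.
    (Illustrative sketch, not the paper's exact loss.)
    """
    # Normalize each filter to unit l2 norm.
    Wn = W / np.linalg.norm(W, axis=1, keepdims=True)
    G = Wn @ Wn.T                    # Gram matrix of pairwise correlations
    off = G - np.eye(G.shape[0])     # zero out the diagonal
    return np.sum(off ** 2)

def welch_bound(m, n):
    """Lower bound on the maximal cross-correlation of m unit vectors
    in R^n; it is attained exactly when the vectors form an ETF."""
    return np.sqrt((m - n) / (n * (m - 1)))

# Example: a random 6x4 filter bank has higher worst-case correlation
# than the Welch bound for m=6 vectors in R^4.
rng = np.random.default_rng(0)
W = rng.standard_normal((6, 4))
print("penalty:", etf_regularizer(W))
print("Welch bound for (6, 4):", welch_bound(6, 4))
```

An orthonormal filter bank (e.g. the identity) incurs zero penalty, while overcomplete banks (more filters than dimensions) can at best reach the Welch bound rather than zero correlation.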

Original language: English
State: Published - 2020
Event: 31st British Machine Vision Conference, BMVC 2020 - Virtual, Online
Duration: 7 Sep 2020 - 10 Sep 2020

Conference: 31st British Machine Vision Conference, BMVC 2020
City: Virtual, Online

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
