Highway State Gating for Recurrent Highway Networks: Improving Information Flow Through Time

Ron Shoham, Haim Permuter

Research output: Chapter in Book/Report/Conference proceeding; Conference contribution; peer-review

Abstract

Recurrent Neural Networks (RNNs) play a major role in the field of sequential learning and have outperformed traditional algorithms on many benchmarks. Training deep RNNs remains a challenge, and most state-of-the-art models are structured with a transition depth of 2-4 layers. Recurrent Highway Networks (RHNs) were introduced to tackle this issue and have achieved state-of-the-art performance on several benchmarks using a depth of 10 layers. However, the performance of this architecture hits a bottleneck and ceases to improve when more layers are added. In this work, we analyze the causes of this behavior and postulate that the main source is the way information flows through time. We introduce a novel and simple variation of the RHN cell, called Highway State Gating (HSG), which allows more layers to be added while continuing to improve performance. By applying a gating mechanism to the state, we let the network "choose" whether to pass information directly through time or to gate it. This mechanism also allows the gradient to back-propagate directly through time and therefore yields slightly faster convergence. We use the Penn Treebank (PTB) dataset as a platform for an empirical proof of concept. Empirical results show that HSG improves performance at all depths, and that the improvement grows as the depth increases.
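The abstract describes the HSG idea only at a high level: a learned gate at each time step decides whether the previous hidden state is passed directly through time or replaced by the candidate state produced by the deep RHN transition. The sketch below illustrates one plausible parameterization of such a state gate in PyTorch; the exact gate inputs and equations are assumptions for illustration, not the paper's published formulation.

```python
# Minimal sketch of a Highway State Gating (HSG)-style update (PyTorch).
# The gate parameterization here (a sigmoid over the concatenation of the
# previous state and the candidate state) is a hypothetical choice; the
# abstract does not specify the exact equations.
import torch
import torch.nn as nn


class HighwayStateGate(nn.Module):
    """Blends the previous hidden state with the new candidate state."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.gate = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, prev_state: torch.Tensor, candidate: torch.Tensor) -> torch.Tensor:
        g = torch.sigmoid(self.gate(torch.cat([prev_state, candidate], dim=-1)))
        # g -> 1 passes the previous state straight through time, giving the
        # gradient a direct path across time steps; g -> 0 adopts the candidate
        # produced by the deep RHN transition.
        return g * prev_state + (1.0 - g) * candidate


# Usage: wrap the output of a deep recurrent transition at every time step.
if __name__ == "__main__":
    hsg = HighwayStateGate(hidden_size=8)
    s_prev = torch.zeros(4, 8)   # previous hidden state (batch of 4)
    s_cand = torch.randn(4, 8)   # candidate state from a deep RHN stack
    s_next = hsg(s_prev, s_cand)
    print(s_next.shape)          # torch.Size([4, 8])
```

The intuition matches the abstract: when the gate favors the identity path, information and gradients flow through time unimpeded, which is what allows deeper transition stacks to keep improving.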

Original language: American English
Title of host publication: Cyber Security Cryptography and Machine Learning - Second International Symposium, CSCML 2018, Proceedings
Editors: Itai Dinur, Shlomi Dolev, Sachin Lodha
Publisher: Springer Verlag
Pages: 120-128
Number of pages: 9
ISBN (Print): 9783319941462
DOIs
State: Published - 1 Jan 2018
Event: 2nd International Symposium on Cyber Security Cryptography and Machine Learning, CSCML 2018 - Beer-Sheva, Israel
Duration: 21 Jun 2018 - 22 Jun 2018

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 10879 LNCS

Conference

Conference: 2nd International Symposium on Cyber Security Cryptography and Machine Learning, CSCML 2018
Country/Territory: Israel
City: Beer-Sheva
Period: 21/06/18 - 22/06/18

Keywords

  • Deep learning
  • Machine learning
  • Recurrent Highway Network
  • Recurrent Neural Networks
  • Sequential learning

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • General Computer Science
