Hysteresis Activation Function for Efficient Inference

Moshe Kimhi, Idan Kashani, Avi Mendelson, Chaim Baskin

Research output: Contribution to journal › Conference article › peer-review

Abstract

The widely used ReLU is favored for its hardware efficiency, as its implementation at inference reduces to a one-bit sign check, yet it suffers from issues such as the "dying ReLU" problem, where neurons fail to activate during training and remain permanently at zero, as highlighted by Lu et al. [16]. Traditional approaches to mitigating this issue often introduce more complex and less hardware-friendly activation functions. In this work, we propose the Hysteresis Rectified Linear Unit (HeLU), an efficient activation function designed to address the "dying ReLU" problem with minimal complexity. Unlike traditional activation functions with a fixed threshold for both training and inference, HeLU employs a variable threshold that refines backpropagation. This refined mechanism allows simpler activation functions to achieve performance competitive with their more complex counterparts without introducing unnecessary complexity or requiring inductive biases. Empirical evaluations demonstrate that HeLU improves model generalization across diverse datasets, offering a promising solution for efficient and effective inference across a wide range of neural network architectures.
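A minimal sketch of the idea described in the abstract, not the authors' reference implementation: the forward pass is a plain ReLU (so inference remains a one-bit sign check), while the backward pass is assumed to gate gradients at a shifted negative threshold, here an illustrative hyperparameter tau, so that pre-activations slightly below zero still receive gradient. The class name HysteresisReLU and the value of tau are assumptions for illustration only.

import torch


class HysteresisReLU(torch.autograd.Function):
    """Forward: max(x, 0). Backward: pass gradient where x > -tau (assumed)."""

    @staticmethod
    def forward(ctx, x, tau=0.1):  # tau is an assumed, illustrative hyperparameter
        ctx.save_for_backward(x)
        ctx.tau = tau
        return x.clamp(min=0.0)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Gradient gate uses the shifted (hysteresis) threshold -tau instead of 0,
        # so units just below zero can still recover instead of "dying".
        grad_input = grad_output * (x > -ctx.tau).to(grad_output.dtype)
        return grad_input, None  # no gradient with respect to tau


# Usage example: gradients flow for inputs in (-tau, 0), unlike standard ReLU.
x = torch.randn(4, requires_grad=True)
y = HysteresisReLU.apply(x, 0.1)
y.sum().backward()
print(x, x.grad)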

Original language: American English
Pages (from-to): 414-422
Number of pages: 9
Journal: Proceedings of Machine Learning Research
Volume: 262
State: Published - 1 Jan 2024
Event: 4th NeurIPS Efficient Natural Language and Speech Processing Workshop - Vancouver, Canada
Duration: 14 Dec 2024 → …

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
