Abstract
Activation functions (AFs) play a pivotal role in the performance of neural networks. The Rectified Linear Unit (ReLU) is currently the most commonly used AF. Several replacements for ReLU have been suggested, but improvements have proven inconsistent. Some AFs exhibit better performance for specific tasks, but it is hard to know a priori how to select the appropriate one(s). Studying both standard fully connected neural networks (FCNs) and convolutional neural networks (CNNs), we propose a novel, three-population, co-evolutionary algorithm to evolve AFs, and compare it to four other methods, both evolutionary and non-evolutionary. Tested on four datasets---MNIST, FashionMNIST, KMNIST, and USPS---coevolution proves to be a performant algorithm for finding good AFs and AF architectures.
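The abstract does not detail the coevolutionary setup, so the following is only a minimal, illustrative sketch of cooperative coevolution over three AF populations, assuming one evolved AF per network block and a placeholder fitness. The names (`PRIMITIVES`, `coevolve`, the score table) are hypothetical and not taken from the paper; in the actual method, fitness would presumably be the validation accuracy of an FCN or CNN trained with the candidate AFs.

```python
import random

# Hypothetical pool of AF primitives; the paper's actual representation of
# evolved activation functions is not given in this abstract.
PRIMITIVES = ["relu", "tanh", "sigmoid", "swish", "elu", "identity"]

def random_af():
    return random.choice(PRIMITIVES)

def mutate(_af):
    # Toy mutation: resample a primitive.
    return random.choice(PRIMITIVES)

def fitness(af_triple):
    # Placeholder fitness (fixed score per primitive, averaged over the triple).
    # Swap this for training/validating a small FCN or CNN that uses the AFs.
    score = {"relu": 0.60, "swish": 0.70, "elu": 0.65,
             "tanh": 0.50, "sigmoid": 0.40, "identity": 0.30}
    return sum(score[a] for a in af_triple) / 3.0

def coevolve(pop_size=8, generations=25, seed=0):
    random.seed(seed)
    # Three cooperating populations, e.g. one AF per network block (assumption).
    pops = [[random_af() for _ in range(pop_size)] for _ in range(3)]
    reps = [pop[0] for pop in pops]              # current representatives
    best, best_fit = tuple(reps), fitness(tuple(reps))

    for _ in range(generations):
        for i, pop in enumerate(pops):
            # Evaluate each candidate together with the other populations'
            # representatives (cooperative credit assignment).
            scored = sorted(
                ((fitness(tuple(reps[:i] + [ind] + reps[i + 1:])), ind)
                 for ind in pop),
                key=lambda t: t[0], reverse=True)
            top_fit, top_ind = scored[0]
            reps[i] = top_ind
            if top_fit > best_fit:
                best, best_fit = tuple(reps), top_fit
            # Truncation selection plus mutation to form the next generation.
            survivors = [ind for _, ind in scored[: pop_size // 2]]
            pops[i] = survivors + [mutate(random.choice(survivors))
                                   for _ in range(pop_size - len(survivors))]
    return best, best_fit

if __name__ == "__main__":
    afs, fit = coevolve()
    print("Best AF combination:", afs, "estimated fitness:", round(fit, 3))
```

Replacing the placeholder `fitness` with actual network training and validation on MNIST, FashionMNIST, KMNIST, or USPS would turn this skeleton into an experiment along the lines the abstract describes.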
| Original language | American English |
| --- | --- |
| Title of host publication | Proceedings of the Genetic and Evolutionary Computation Conference Companion, GECCO 2022 |
| Place of Publication | New York, NY, USA |
| Pages | 2113-2121 |
| Number of pages | 9 |
| ISBN (Electronic) | 9781450392686 |
| DOIs | |
| State | Published - 19 Jul 2022 |
| Event | 2022 Genetic and Evolutionary Computation Conference, GECCO 2022 - Virtual, Online, United States. Duration: 9 Jul 2022 → 13 Jul 2022 |
Conference

| Conference | 2022 Genetic and Evolutionary Computation Conference, GECCO 2022 |
| --- | --- |
| Country/Territory | United States |
| City | Virtual, Online |
| Period | 9/07/22 → 13/07/22 |
Keywords
- Computer graphics
- Computing methodologies
- Design and analysis of algorithms
- Discrete optimization
- Image manipulation
- Image processing
- Machine learning
- Mathematical optimization
- Optimization with randomized search heuristics
- Theory of computation