Abstract
We introduce BERT mutation, a novel, domain-independent mutation operator for Genetic Programming (GP) that leverages the Masked Language Modeling approach from Natural Language Processing (NLP) to improve convergence. By combining deep reinforcement learning with the BERT transformer architecture, BERT mutation intelligently suggests node replacements within GP trees to enhance their fitness. Unlike traditional stochastic mutation methods, it adapts dynamically, using historical fitness data to optimize mutation decisions and thereby produce more effective evolutionary improvements. Through comprehensive evaluations across three benchmark domains, we demonstrate that BERT mutation significantly outperforms conventional and state-of-the-art mutation operators in both convergence speed and solution quality. This work represents a pivotal step toward integrating state-of-the-art deep learning into evolutionary algorithms, pushing the boundaries of adaptive optimization in GP.
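To make the masked-language-modeling idea concrete, the sketch below mutates a prefix-serialized GP tree by masking one node and sampling a replacement from a learned score distribution rather than uniformly at random. It is a minimal illustration under stated assumptions, not the paper's implementation: the primitive set, the arity-compatible candidate filter, and the uniform stub `masked_lm_scores` standing in for a trained BERT masked-LM (which the paper combines with reinforcement learning on fitness feedback) are all hypothetical.

```python
import random

# Hypothetical primitive set for a symbolic-regression GP tree (illustrative only).
PRIMITIVES = ["add", "sub", "mul", "div", "x", "1.0"]
ARITY = {"add": 2, "sub": 2, "mul": 2, "div": 2, "x": 0, "1.0": 0}

def masked_lm_scores(tokens, mask_index):
    """Placeholder for a BERT-style masked-LM head: given a token sequence
    with one position masked, return a score per candidate replacement.
    A real implementation would run a fine-tuned transformer; here we
    return uniform scores restricted to arity-compatible symbols so the
    mutated tree stays syntactically valid."""
    arity = ARITY[tokens[mask_index]]
    candidates = [s for s in PRIMITIVES if ARITY[s] == arity]
    return {c: 1.0 / len(candidates) for c in candidates}

def bert_style_mutation(prefix_tokens, rng=random):
    """Mask one node of a prefix-serialized GP tree and sample its
    replacement from the masked-LM distribution."""
    i = rng.randrange(len(prefix_tokens))
    scores = masked_lm_scores(prefix_tokens, i)
    symbols, weights = zip(*scores.items())
    mutated = list(prefix_tokens)
    mutated[i] = rng.choices(symbols, weights=weights, k=1)[0]
    return mutated

# Example: mutate the tree add(mul(x, x), 1.0), serialized in prefix order.
print(bert_style_mutation(["add", "mul", "x", "x", "1.0"]))
```

Replacing the stub's uniform weights with the output of a fitness-trained model is what lets the operator bias mutations toward replacements that historically improved fitness.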
| Original language | American English |
| --- | --- |
| Article number | 779 |
| Journal | Mathematics |
| Volume | 13 |
| Issue number | 5 |
| DOIs | |
| State | Published - 1 Mar 2025 |
Keywords
- artificial ant
- combinatorial optimization
- genetic programming
- mutation operator
- reinforcement learning
- surrogate model
- symbolic classification
- symbolic regression
All Science Journal Classification (ASJC) codes
- Computer Science (miscellaneous)
- General Mathematics
- Engineering (miscellaneous)