BERT Mutation: Deep Transformer Model for Masked Uniform Mutation in Genetic Programming

Eliad Shem-Tov, Moshe Sipper, Achiya Elyasaf

Research output: Contribution to journal › Article › peer-review

Abstract

We introduce BERT mutation, a novel, domain-independent mutation operator for Genetic Programming (GP) that leverages advanced Natural Language Processing (NLP) techniques, in particular the Masked Language Modeling (MLM) approach, to improve convergence. By combining deep reinforcement learning with the BERT transformer architecture, BERT mutation intelligently suggests node replacements within GP trees to enhance their fitness. Unlike traditional stochastic mutation methods, BERT mutation adapts dynamically, using historical fitness data to optimize mutation decisions and thereby yielding more effective evolutionary improvements. Through comprehensive evaluations across three benchmark domains, we demonstrate that BERT mutation significantly outperforms conventional and state-of-the-art mutation operators in terms of convergence speed and solution quality. This work represents a pivotal step toward integrating state-of-the-art deep learning into evolutionary algorithms, pushing the boundaries of adaptive optimization in GP.
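
The abstract describes the operator only at a high level. The Python sketch below illustrates the general idea of a masked, arity-preserving node replacement in a GP tree; it is not the authors' implementation. All names here (PRIMITIVES, score_candidates, bert_style_mutation) are illustrative assumptions, and the scoring function is a stand-in for what the paper describes as a BERT-style masked language model trained with reinforcement learning on fitness feedback.

```python
import random

# Hypothetical sketch of masked-node mutation on a GP tree encoded as a
# prefix (Polish-notation) token sequence. In the paper, the masked position
# would be filled by a trained BERT-style masked language model whose
# predictions are shaped by reinforcement learning from fitness feedback;
# here that model is stubbed out by `score_candidates`.

PRIMITIVES = {"add": 2, "sub": 2, "mul": 2, "x": 0, "y": 0, "1": 0}  # symbol -> arity

def score_candidates(tokens, masked_index, candidates):
    """Stand-in for the masked-prediction model: assign a score to each
    candidate replacement for the masked position. A real implementation
    would feed the token sequence (with a [MASK] token at masked_index)
    through a BERT-style transformer and read off per-candidate scores."""
    return {c: random.random() for c in candidates}

def bert_style_mutation(tokens):
    """Mask one node and replace it with the highest-scoring candidate of the
    same arity, so the mutated tree stays syntactically valid."""
    i = random.randrange(len(tokens))
    arity = PRIMITIVES[tokens[i]]
    candidates = [p for p, a in PRIMITIVES.items() if a == arity and p != tokens[i]]
    if not candidates:
        return tokens
    scores = score_candidates(tokens, i, candidates)
    best = max(scores, key=scores.get)
    return tokens[:i] + [best] + tokens[i + 1:]

if __name__ == "__main__":
    # add(mul(x, y), 1) in prefix notation
    tree = ["add", "mul", "x", "y", "1"]
    print(bert_style_mutation(tree))
```

Replacing only same-arity symbols is a simplifying design choice in this sketch; it keeps the tree structurally intact, mirroring how a masked-token replacement leaves the rest of the sequence unchanged.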

Original language: American English
Article number: 779
Journal: Mathematics
Volume: 13
Issue number: 5
DOIs
State: Published - 1 Mar 2025

Keywords

  • artificial ant
  • combinatorial optimization
  • genetic programming
  • mutation operator
  • reinforcement learning
  • surrogate model
  • symbolic classification
  • symbolic regression

All Science Journal Classification (ASJC) codes

  • Computer Science (miscellaneous)
  • General Mathematics
  • Engineering (miscellaneous)
