Abstract
We consider Evolution Strategies (ESs) operating only with isotropic Gaussian mutations on positive quadratic objective functions, and investigate the covariance matrix constructed from the individuals selected by truncation. We prove that the covariance matrix over (1,λ)-selected decision vectors becomes proportional to the inverse of the landscape Hessian as the population size λ increases. This confirms a classical hypothesis that statistical learning of the landscape is an inherent characteristic of standard ESs, and that this distinguishing capability stems solely from the use of isotropic Gaussian mutations and rank-based selection. Even though the model under consideration does not precisely conform to practically encountered scenarios, it serves as a theoretical foundation for learning capabilities within ESs. We also provide broad numerical validation for the proven results, and present empirical evidence for their generalization to (μ,λ)-selection.
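The following minimal Monte Carlo sketch illustrates the stated relation; it is not the paper's own experimental setup. It samples λ isotropic Gaussian offspring around a parent on an assumed positive definite quadratic f(x) = xᵀHx, keeps the (1,λ)-best offspring over many independent trials, and checks that the covariance of the winners is roughly proportional to H⁻¹ (i.e., that C·H is close to a multiple of the identity). The dimension, Hessian H, step size σ, and sample counts are arbitrary assumptions chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy problem (not taken from the paper): f(x) = x^T H x with a
# fixed, positive definite Hessian H in dimension n = 2.
n = 2
H = np.array([[4.0, 1.0],
              [1.0, 2.0]])

lam = 1000       # population size lambda (large, per the asymptotic claim)
trials = 20000   # independent (1,lambda) selection events
sigma = 0.1      # isotropic mutation step size (assumed value)

winners = np.empty((trials, n))
for t in range(trials):
    # lambda isotropic Gaussian mutations around a parent at the origin
    offspring = sigma * rng.standard_normal((lam, n))
    # rank-based truncation selection: keep the single best offspring
    values = np.einsum('ij,jk,ik->i', offspring, H, offspring)
    winners[t] = offspring[np.argmin(values)]

# Covariance over the (1,lambda)-selected decision vectors.
C = np.cov(winners, rowvar=False)

# If C is proportional to H^{-1}, then C @ H is a multiple of the identity;
# normalizing its trace to n should therefore print (approximately) I.
print(np.round(C @ H * n / np.trace(C @ H), 3))
```

With the assumed values above, the printed matrix should be close to the 2×2 identity; increasing λ and the number of trials tightens the agreement, in line with the asymptotic statement of the abstract.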
| Original language | English |
| --- | --- |
| Pages (from-to) | 157-174 |
| Number of pages | 18 |
| Journal | Theoretical Computer Science |
| Volume | 801 |
| DOIs | |
| State | Published - 1 Jan 2020 |
Keywords
- Covariance matrix
- Inverse-relation
- Landscape Hessian
- Statistical learning
- Theory of evolution strategies
All Science Journal Classification (ASJC) codes
- Theoretical Computer Science
- General Computer Science