Generative Fixation

A Unified Explanation for the Adaptive Capacity of
Simple Recombinative Genetic Algorithms

A Doctoral Dissertation by

Keki M. Burjorjee

Computer Science Department,
Brandeis University

August, 2009

Simple genetic algorithms have been used in a wide range of engineering and scientific fields to quickly procure useful solutions to poorly understood optimization problems. Unfortunately, despite the routine use of these algorithms for over three decades, their remarkable adaptive capacity has not been adequately accounted for. In this dissertation, I develop, submit, and support the generative fixation hypothesis—a unified explanation for the adaptive capacity of the simple genetic algorithm.

The generative fixation hypothesis is based on the inference of a close relationship between the simple genetic algorithm and a promising general-purpose stochastic search heuristic, called hyperclimbing, for optimizing over attribute product spaces (for example, the set of all binary strings of some fixed length) with rugged fitness functions. Hyperclimbing works by progressively limiting sampling to a series of nested subsets with increasing expected fitness. At each step, this heuristic searches through vast numbers of coarse partitions of the subset it "inhabits", and identifies ones that partition this set into subsets whose expected fitness values are significantly variegated. Because hyperclimbing is sensitive, not to the local features of a search space, but to certain more global statistics, it is not susceptible to the kinds of issues that waylay local search heuristics.
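The steps just described can be sketched in code. The following is only an illustrative rendering, not the dissertation's formal specification: the function name, the parameters (`order`, `samples`, `rounds`, `threshold`), and the restriction to order-1 partitions by default are all assumptions made for the sake of a small, runnable example.

```python
import itertools
import random
import statistics

def hyperclimb(fitness, length, order=1, samples=200, rounds=5,
               threshold=0.5, seed=0):
    """Illustrative hyperclimbing sketch over binary strings.

    Repeatedly: sample strings consistent with the loci fixed so far,
    examine each small set of unfixed loci (a coarse partition of the
    current subset), and fix the best-scoring setting of the set whose
    subsets differ most in mean sampled fitness.
    """
    rng = random.Random(seed)
    fixed = {}  # locus -> bit value fixed so far
    for _ in range(rounds):
        free = [i for i in range(length) if i not in fixed]
        if not free:
            break
        # Sample uniformly from the nested subset defined by `fixed`.
        pop = []
        for _ in range(samples):
            x = [rng.randint(0, 1) for _ in range(length)]
            for i, b in fixed.items():
                x[i] = b
            pop.append((x, fitness(x)))
        best = None  # (spread, loci, best setting)
        for loci in itertools.combinations(free, order):
            groups = {}
            for x, f in pop:
                groups.setdefault(tuple(x[i] for i in loci), []).append(f)
            means = {k: statistics.mean(v) for k, v in groups.items()}
            if len(means) < 2:
                continue
            spread = max(means.values()) - min(means.values())
            if best is None or spread > best[0]:
                best = (spread, loci, max(means, key=means.get))
        if best is None or best[0] <= threshold:
            break  # no partition with markedly variegated means
        for i, b in zip(best[1], best[2]):
            fixed[i] = b  # fixation: restrict all future sampling
    return fixed
```

On a toy fitness function in which a few loci carry all the signal, the sketch fixes those loci and stops; the point of the example is only the control flow (sample, partition, compare means, fix, recurse), not performance.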

The chief barrier to the wide and enthusiastic use of hyperclimbing is that it seems to scale very poorly with the number of attributes. If one takes the seemingly high cost of applying hyperclimbing to large search spaces at face value, this heuristic quickly loses its shine. A key conclusion of this dissertation is that this seemingly high cost is illusory. I present evidence that strongly suggests that the simple genetic algorithm can implement hyperclimbing extraordinarily efficiently.
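For concreteness, the "simple genetic algorithm" referred to throughout is the canonical setup: fitness-proportional selection, per-bit uniform crossover, and bitwise mutation. The sketch below is a minimal rendering under assumed parameter values (population size, generation count, mutation rate); it shows the algorithm's structure only, and does not by itself demonstrate the efficiency claim.

```python
import random

def simple_ga(fitness, length, pop_size=100, gens=50, p_mut=0.005, seed=0):
    """Minimal simple genetic algorithm with uniform crossover.

    Fitness-proportional (roulette-wheel) selection, per-bit uniform
    crossover, and bitwise mutation. Fitness values must be positive.
    """
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(gens):
        scores = [fitness(x) for x in pop]
        total = float(sum(scores))

        def pick():  # roulette-wheel (fitness-proportional) selection
            r = rng.uniform(0, total)
            acc = 0.0
            for x, s in zip(pop, scores):
                acc += s
                if acc >= r:
                    return x
            return pop[-1]

        nxt = []
        for _ in range(pop_size):
            a, b = pick(), pick()
            # Uniform crossover: each bit drawn from either parent.
            child = [a[i] if rng.random() < 0.5 else b[i]
                     for i in range(length)]
            # Bitwise mutation with probability p_mut per locus.
            child = [bit ^ (rng.random() < p_mut) for bit in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)
```

On a toy objective such as `simple_ga(lambda x: sum(x) + 1, 20)`, the population drives most loci toward fixation at 1 within a few dozen generations, which is the fixation behavior the hypothesis is concerned with.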

I compare the generative fixation hypothesis with the building block hypothesis and argue that the former surpasses the latter on three counts:

  1. The former hypothesis can account for the adaptive capacity of a wider class of simple genetic algorithms. This class includes simple genetic algorithms that use uniform crossover.
  2. The former hypothesis presumes less about the distribution of fitness over the chromosome set. It does not, for example, presume the existence of a hierarchy of building blocks.
  3. The former hypothesis can successfully pass a demanding test of validity, one that involves the application of a simple genetic algorithm with and without a mechanism called clamping to large, random instances of MAXSAT.
In breaking from the building block hypothesis, the generative fixation hypothesis returns to two classic positions in population genetics:
  1. That fixation is the vehicle by which adaptive gains are secured.
  2. That the function of recombination is to prevent hitchhiking.
On a third matter, that of the unit of selection, the generative fixation hypothesis is at odds with the position taken by orthodox neo-Darwinists, which is that the unit of selection in an evolving population is always reducible to the unit of inheritance; the gene, in other words, is the ultimate unit of selection. In contrast, the generative fixation hypothesis holds that the unit of selection can be a small set of unlinked or weakly linked genes. This difference in positions is highly significant from a computational standpoint.

Jordan Pollack, Brandeis University (Committee Chair)
Harry Mairson, Brandeis University
Timothy Hickey, Brandeis University
Lee Altenberg, University of Hawaii at Manoa