ELF™: An Evolutionary Learning Framework

ELF™, the Evolutionary Learning Framework developed at FINAL™ Neural Architecture Laboratories, is a crossover-less, reinforced optimisation framework based on genetic algorithms and simulated evolution. ELF’s primary purpose is to minimise expert bias and deliver novel solutions through learning algorithms that gradually and randomly evolve a population.

Our design choices reflect this aim.

Thanks to ELF’s crossover-less nature, no expert-driven genotype layout or genotype-phenotype mapping is needed. The fittest creatures are selected for reproduction, and random mutations are introduced into all aspects of the offspring thus spawned, resulting in changes at the common genotype-phenotype level.
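As a rough illustration of this crossover-less cycle, the sketch below keeps the fittest creatures as parents and produces offspring purely by cloning and random mutation. The representation of a creature as a dict of numeric "aspects", the `mutation_rate` parameter, and the Gaussian perturbation are all assumptions made for the sake of the example, not details of ELF itself.

```python
import random

# Hypothetical sketch of crossover-less selection and mutation. Because no
# crossover operator exists, no genotype layout has to be designed: every
# aspect of a cloned parent is simply eligible for random perturbation.
def next_population(population, fitness, pop_size, mutation_rate=0.1):
    # Keep the fittest half as parents (lower fitness value = better here).
    parents = sorted(population, key=fitness)[: max(1, len(population) // 2)]
    offspring = []
    while len(parents) + len(offspring) < pop_size:
        child = dict(random.choice(parents))   # clone one parent
        for gene in child:                     # mutate any aspect at random
            if random.random() < mutation_rate:
                child[gene] = child[gene] + random.gauss(0, 1)
        offspring.append(child)
    return parents + offspring
```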

To overcome inter-generation local minima, multiple generations are spawned in each cycle. This way, early mutations that deteriorate the fitness of a specific generation do not lead to the elimination of the entire line from the creature pool; the possibility of fitness improvement in later generations is left open.
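One way to picture this multi-generation spawning is the sketch below: each creature's lineage is extended several generations deep before any culling, so a line whose first mutation hurts fitness can still recover within the same cycle. The `depth` parameter and the function names are assumptions for illustration only.

```python
# Hypothetical illustration of spawning several successive generations per
# cycle before culling, as described above.
def spawn_lineage(creature, mutate, depth=3):
    """Return the chain of `depth` successive generations descended from
    `creature`; only once the whole chain exists is fitness compared."""
    lineage = [creature]
    for _ in range(depth):
        lineage.append(mutate(lineage[-1]))
    return lineage

def best_of_cycle(population, mutate, fitness, depth=3):
    # Evaluate every generation of every lineage and keep the single fittest
    # creature at any depth, so an early bad mutation does not doom the line.
    candidates = [c for p in population for c in spawn_lineage(p, mutate, depth)]
    return min(candidates, key=fitness)
```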

The fitness of creatures is evaluated relative to one another. This approach enables a large array of dissimilar training scenarios to be applied within the same training round. In the case of creatures with identical fitness (error) values, the least complex of them is deemed the fittest.
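The tie-breaking rule can be written as a two-key sort, error first and complexity second. The dict keys `error` and `complexity` are assumed names for this sketch, not part of ELF's actual interface.

```python
# Hypothetical sketch of the tie-breaking rule: rank by error, and among
# creatures with identical error the least complex one wins.
def rank(creatures):
    """Sort creatures (dicts with assumed 'error' and 'complexity' keys)
    so that the fittest comes first."""
    return sorted(creatures, key=lambda c: (c["error"], c["complexity"]))
```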

An example

In the following example each creature is a simple computer. Each such computer has:

  • instructions (I);
  • the sequence (S) of these instructions;
  • a memory (M) consisting of initial values;
  • connections (C) between the memory and each instruction.

Creatures use a common set of 14 instructions that include NOOP. Fitness is calculated as the least-mean-squares (LMS) error between the actual output and the expected output for each creature and training scenario. Complexity is calculated based on (I) + (S) + (M) + (C).
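The creature structure and the two measures above can be sketched as follows. The field names mirror the (I), (S), (M) and (C) components, but the concrete representations (element counts for complexity, a mean of squared differences for the LMS error) are assumptions made for this example.

```python
from dataclasses import dataclass

# Hypothetical model of one "simple computer" creature.
@dataclass
class Creature:
    instructions: list   # I: opcodes drawn from the common 14-instruction set
    sequence: list       # S: execution order of those instructions
    memory: list         # M: initial memory values
    connections: list    # C: wiring between memory and each instruction

    def complexity(self):
        # Complexity = (I) + (S) + (M) + (C), taken here as element counts.
        return (len(self.instructions) + len(self.sequence)
                + len(self.memory) + len(self.connections))

def lms_error(actual, expected):
    # Least-mean-squares error between actual and expected outputs.
    return sum((a - e) ** 2 for a, e in zip(actual, expected)) / len(expected)
```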

The two graphs below show the evolution of the error level for the fittest creature, and the corresponding complexity, in two ELF™ test runs. The task in both examples was the same: simply to copy the creature’s input to its output. It is interesting to note the gradual increase in complexity, with a reduction following each jump in fitness – most likely due to the elimination of superfluous NOOPs and unused memory elements.

A graph of the best error and related complexity in a test run of the Evolutionary Learning Framework developed at FINAL™ Labs
A graph of the best error and related complexity in another ELF™ test run