Solving Hard Problems with Evolutionary Computation

Evolutionary Computation (EC) is a fundamentally different learning method from Deep Learning (DL) and Reinforcement Learning (RL). Those methods are primarily based on improving an existing solution through gradients on performance. Thus, learning is based on exploiting what we know: known examples and small successive changes to an existing solution. In contrast, EC is based on a parallel search in a population of solutions. Its main driver is exploration, i.e. modifying a diverse set of solutions systematically and often drastically, based on what is learned from the entire space of solutions. It is common for EC methods to find solutions that are creative and surprising to human designers. Moreover, EC is naturally parallel and well positioned to take advantage of parallel computing resources. Perhaps the greatest example of such a scale-up is the EC-Star system developed in previous research at Sentient (see Paper 3 below). Applied to the problem of evolving stock traders in 2015, EC-Star ran on 2 million CPUs and evaluated 40 trillion candidates!
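To make the contrast concrete, the sketch below outlines the population-based loop that EC performs: generate a diverse population, evaluate it, select the better candidates, and create new ones through crossover and mutation. It is a generic illustration only; the toy task, operators, and parameters are placeholders rather than any particular system described here.

```python
import random

# Generic outline of population-based evolutionary search (illustration only;
# the operators, parameters, and toy task below are placeholder assumptions).
def evolve(fitness, random_candidate, mutate, crossover,
           pop_size=100, generations=200):
    population = [random_candidate() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[: pop_size // 2]          # keep the better half
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children            # next generation
    return max(population, key=fitness)

# Toy usage: maximize the number of 1s in a 20-bit string.
bits = 20
best = evolve(fitness=sum,
              random_candidate=lambda: [random.randint(0, 1) for _ in range(bits)],
              mutate=lambda c: [b ^ (random.random() < 0.05) for b in c],
              crossover=lambda a, b: [random.choice(pair) for pair in zip(a, b)])
print(sum(best), "ones out of", bits)
```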

A current focus of EC research is to utilize such resources to solve very hard problems, i.e. those that (1) have a large search space, (2) have high dimensionality, and (3) are difficult to search because they are deceptive, that is, finding good solutions requires crossing valleys in fitness instead of simply following a gradient. With respect to large search spaces, the EC-Star system was recently shown to find solutions to the 70-multiplexer problem. The search space of this problem is 2^(2^70), or roughly 10^(3.55×10^20) (a number so large that it would take light about 95 years to traverse its 10pt printout :-). With respect to high dimensionality, Deb et al. recently showed that EC can be used to solve problems with up to a billion variables.
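Those numbers can be checked with a quick back-of-the-envelope calculation: there are 2^(2^70) Boolean functions of 70 inputs, and counting the decimal digits of that number gives both the 10^(3.55×10^20) figure and the length of the printout. The ~2.5 mm width assumed here for a 10pt printed digit is my own assumption for the estimate.

```python
import math

# Back-of-the-envelope check of the 70-multiplexer search-space size.
# There are 2^(2^70) Boolean functions of 70 inputs; count the decimal digits
# of that number and the length of a printout (digit width is an assumption).
digits = 2**70 * math.log10(2)       # ~3.55e20 decimal digits
print(f"search space ~ 10^{digits:.3g}")

digit_width_m = 0.0025               # assumed width of a 10pt printed digit
light_year_m = 9.4607e15
print(f"printout ~ {digits * digit_width_m / light_year_m:.0f} light years long")
```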

The new paper and the demos in this section report breakthrough technology in the third area, i.e. finding solutions to highly deceptive problems. The approach (illustrated in Demo 1.1) is based on three ideas that have been productive in recent EC research. First, multiple objectives can be used to create a search process that can get around deception. Second, it is important to focus such a search on useful combinations of objectives, instead of wasting effort on single-objective dead ends. Third, that space needs to be explored comprehensively; this can be done by favoring novel solutions within the useful boundaries. As a result, the method is likely to find solutions to problems that are both large and deceptive.
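A rough sketch of how these ideas can fit together in a single selection step is shown below: composite objectives identify the useful region of the objective space, and novelty selection spreads the search within it. This is only an illustration of the general principle, not the exact algorithm of Paper 1; the objective weights, elite fraction, and neighborhood size k are assumptions made for the example.

```python
import random

# Illustrative sketch of composite-objective + novelty selection (not the
# paper's exact algorithm; weights, elite_frac, and k are assumptions).

def composite_fitness(objectives, weights):
    """Collapse several objectives into a single composite score."""
    return sum(w * o for w, o in zip(weights, objectives))

def novelty(i, behaviors, k=5):
    """Average distance from behaviors[i] to its k nearest other behaviors."""
    dists = sorted(abs(behaviors[i] - b) for j, b in enumerate(behaviors) if j != i)
    return sum(dists[:k]) / max(1, min(k, len(dists)))

def select_parents(population, weights, elite_frac=0.3, n_parents=10):
    """Step 1: keep candidates with good composite fitness (the useful region).
    Step 2: among those, favor the most novel behaviors as parents."""
    ranked = sorted(range(len(population)),
                    key=lambda i: composite_fitness(population[i]["objectives"],
                                                    weights),
                    reverse=True)
    behaviors = [population[i]["behavior"] for i in ranked]
    n_elite = max(1, int(elite_frac * len(ranked)))
    by_novelty = sorted(range(n_elite),
                        key=lambda p: novelty(p, behaviors),
                        reverse=True)
    return [population[ranked[p]] for p in by_novelty[:n_parents]]

# Toy usage: candidates with two objectives and a scalar behavior descriptor.
population = [{"objectives": (random.random(), random.random()),
               "behavior": random.random()} for _ in range(50)]
parents = select_parents(population, weights=(0.5, 0.5))
print(len(parents), "parents selected")
```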

Paper 1 below illustrates this technique in the domain of designing minimal sorting networks. While it is not hard to come up with a design that is correct, i.e. sorts all inputs, minimizing the number of comparators and layers is very hard. A reduction in size often requires a drastic redesign of much of the network, as shown in Demo 1.2. Even with limited compute, evolution discovers minimal networks for many small input sizes for which the optimal networks are known. The next step is to run the composite-novelty method on massive compute, utilizing EC-Star and methods for running novelty search on it (as shown in Paper 2 below). Such an extension is likely to discover many more new minimal networks for larger input sizes. Stay tuned!
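For concreteness, the sketch below shows a known minimal 4-input sorting network (5 comparators in 3 layers) and how such a network can be checked for correctness with the zero-one principle: a comparator network sorts all inputs if it sorts all binary inputs. It is a hand-written illustration of the domain, not output from the evolutionary runs.

```python
from itertools import product

# A known minimal 4-input sorting network: 5 comparators in 3 layers.
network = [(0, 1), (2, 3),   # layer 1
           (0, 2), (1, 3),   # layer 2
           (1, 2)]           # layer 3

def apply_network(values, comparators):
    """Run the comparator network on a sequence of values."""
    values = list(values)
    for i, j in comparators:
        if values[i] > values[j]:
            values[i], values[j] = values[j], values[i]
    return values

# Zero-one principle: if the network sorts every binary input, it sorts everything.
assert all(apply_network(bits, network) == sorted(bits)
           for bits in product((0, 1), repeat=4))
print("sorts all 2^4 binary inputs -> sorts all 4-element inputs")
```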


Paper 1: Shahrzad, H., Fink, D., and Miikkulainen, R. (2018). Enhanced Optimization with Composite Objectives and Novelty Selection. In Proceedings of the 2018 Conference on Artificial Life (ALife'18, Tokyo, Japan).

Download Paper


Background paper:

Paper 2: Hodjat, B., Shahrzad, H., and Miikkulainen, R. (2016). Distributed Age-Layered Novelty Search. In Proceedings of the Fifteenth International Conference on the Synthesis and Simulation of Living Systems (ALife'16, Cancun, Mexico).

Download Paper

Background paper:

Paper 3: O’Reilly, U.-M., Wagy, M., and Hodjat, B. (2013). EC-Star: A massive-scale, hub and spoke, distributed genetic programming system. In Riolo, R., Vladislavleva, E., Ritchie, M. D., and Moore, J. H., editors, Genetic Programming Theory and Practice X, pages 73-85. Springer, New York.

Download Paper