Optimization Heuristics, part II: Learning from Nature

Last time, I talked about what happens when we have no idea how to solve a particular problem, and I concluded that in such cases it might be useful to turn our attention to nature’s way of solving problems. Today, I want to talk about the ideas we can extract from nature.

So, what does nature offer us? The most striking example is, of course, evolution, so it may seem like a good idea to simulate it. That usually means maintaining a population of objects in your search space and imposing some kind of selection pressure on them, making it hard for bad solutions to procreate. It is also usual to add some kind of mutation, since without mutation you are not getting anywhere.
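To make this concrete, here is a minimal sketch of such an evolutionary loop in Python. Everything here is an illustrative assumption on my part, not a canonical algorithm: the bitstring representation, the toy “count the ones” fitness function, truncation selection, one-point crossover, and all the parameters.

```python
import random

random.seed(0)  # for reproducibility of this toy run

# Toy fitness: count of 1-bits ("OneMax"). In a real application, this
# would be whatever quality measure your search space has.
def fitness(bits):
    return sum(bits)

def mutate(bits, rate=0.05):
    # Flip each bit independently with a small probability.
    return [b ^ 1 if random.random() < rate else b for b in bits]

def evolve(pop_size=30, length=20, generations=100):
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Selection pressure: only the fitter half gets to procreate.
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)  # one-point crossover
            children.append(mutate(a[:cut] + b[cut:]))
        pop = children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))
```

On this trivial problem the loop converges quickly; the point is only to show the three ingredients from the paragraph above in one place: a population, selection pressure, and mutation.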

This sounds simple enough. But there is a catch: you might have noticed that evolution was not in any particular hurry to produce the current biological landscape. The point of evolution is that it works at all, not that it is efficient. Nevertheless, evolutionary algorithms have been around for forty years, and they work really well in practice.

There is also the famous example of nozzle shape optimization, one of the main selling points for artificial evolution: no sane engineer could have imagined such nozzle shapes, but search heuristics have no notion of “sane”, “intuitive” or similar, which lets them exploit every aspect of the problem. That is both good and bad: good, since they can go ways no one would go, and bad, since they will also explore a lot of places in the search space that are not necessarily interesting. Think of the configuration of the mammalian eye, and especially the blind spot: no mutation can undo it now, and future humans (unless they turn into cyborgs) are stuck with it, since the mutation distance between the current eye design and something more optimal is too large.

Another great hit is simulated annealing, which is not among the hot trends anymore, but the idea is still cool. The search process is modeled after thermodynamic processes in the lattice structure of cooling metal: as the temperature sinks, the probability of changing the solution sinks too, and the solution stabilizes, just as a metal slab solidifies and settles into a thermodynamically favorable state as it cools. This approach has been thoroughly analyzed and has several cool properties: for some rather hard problems, it is expected to deliver optimal solutions. The key word is “expected”: since the process is stochastic, no guarantees can be made, so the property is actually rather weak. But it is cool nevertheless.
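The cooling idea fits in a few lines of code. Here is a sketch under assumptions of my own: a toy objective (minimize x² over the integers with ±1 moves), a geometric cooling schedule, and the standard Metropolis acceptance rule; all parameters are made up for illustration.

```python
import math
import random

random.seed(1)  # for reproducibility of this toy run

# Toy objective ("energy") to minimize.
def energy(x):
    return x * x

def anneal(x=40, temp=100.0, cooling=0.95, steps_per_temp=50, t_min=1e-3):
    while temp > t_min:
        for _ in range(steps_per_temp):
            candidate = x + random.choice([-1, 1])
            delta = energy(candidate) - energy(x)
            # Always accept improvements; accept worse moves with a
            # probability that sinks along with the temperature.
            if delta <= 0 or random.random() < math.exp(-delta / temp):
                x = candidate
        temp *= cooling  # geometric cooling schedule
    return x

result = anneal()
print(result)
```

At high temperature the search jumps around almost freely; as the temperature sinks, worsening moves become ever less likely, and the solution “freezes” near a good point, mirroring the metal-slab picture above.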

A lot of other heuristics have been developed: DNA computing, particle swarm optimization, ant colony optimization, and a whole lot of other techniques that borrow ideas from processes occurring in nature. But they all come with a price: every one of them has an ugly side, which I will talk about in the next post.
