One of the neater implementations/uses of genetic algorithms that I have seen was the design of a spacecraft antenna.

This is what the GA came up with for the design, given the constraints: https://en.wikipedia.org/wiki/Genetic_algorithm#/media/File%3ASt_5-xband-antenna.jpg

And this paper describes the process: http://ti.arc.nasa.gov/m/pub-archive/1244h/1244%20(Hornby).pdf
Took a GA class during my undergrad years -- actually went to a neighboring university to do it because mine didn't offer it -- and it informed a research project I got a tiny grant for. One of the things my prof noted was that many people looooved GAs as a research topic because, at least at the time, a good chunk of the related work was in coming up with ideas for fitness functions and examining them, and that's a vein a reasonably creative person could mine for publications for a long time.
When I was in college I was mystified by genetic algorithms, without knowing much about them. After taking two courses on the subject and reading some books, I came to the conclusion that, apart from being inherently inefficient (which is why you only apply them when you have no alternative), they are actually outperformed by hill climbing (which can be seen as a special case of a GA with a population of 1). Also, the crossover operator seems to do more harm than good, and its usefulness in nature is not fully understood, although there are some theories (this last point is taken from Pedro Domingos's book).
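To make the "hill climbing is a GA with population = 1" point concrete, here is a rough Python sketch (the names and the one-max toy problem are just illustrative assumptions, not from the comment): keep a single parent, mutate it, and keep the child only if it is fitter.

    import random

    # Hill climbing written as a degenerate GA: population of 1, mutation only,
    # and "selection" just keeps the fitter of parent and child.
    def hill_climb(fitness, genome, mutate, iterations=10_000):
        best, best_fit = genome, fitness(genome)
        for _ in range(iterations):
            child = mutate(best)
            child_fit = fitness(child)
            if child_fit > best_fit:
                best, best_fit = child, child_fit
        return best, best_fit

    # Toy "one-max" problem: maximise the number of 1s in a bit string.
    one_max = lambda bits: sum(bits)
    flip = lambda bits: [b ^ (random.random() < 0.05) for b in bits]
    print(hill_climb(one_max, [0] * 50, flip))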
Out of curiosity, does anyone know of examples where genetic algorithms are the "right choice"?

I thought they were really nifty when I first heard of them, but thinking of them as an optimization procedure, they don't really stand up in my experience to basic gradient descent methods. Sure, you can e.g. train a neural network with genetic algorithms, but why would you?

I'd love to be proven wrong though :)
This is probably extremely cynical, but does floydhub have a vested interest in promoting genetic algorithms because they can take a long time to train, especially with neural network hyperparameters as the search space? These kinds of beefy, complex training processes would be fantastic for its business.
Hmm, so what is the theory behind the mutation function? Is it just determined ad hoc by looking at the problem and throwing some standard examples at it?

It seems like you could implement effectively any function with a complicated enough combination of fitness, selection, and mutation functions, but without a theory of which to use, progress would be a bit hard.
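In practice the mutation operator is usually chosen ad hoc to match the genome encoding rather than derived from any theory. Two textbook examples, sketched in Python with illustrative (not theoretically justified) rates:

    import random

    # Bit-string encoding: flip each bit independently with a small probability.
    def bitflip_mutation(genome, rate=0.01):
        return [bit ^ (random.random() < rate) for bit in genome]

    # Real-valued encoding: perturb a few genes with small Gaussian noise.
    def gaussian_mutation(genome, sigma=0.1, rate=0.1):
        return [g + random.gauss(0, sigma) if random.random() < rate else g
                for g in genome]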
I've been working on a Genetic Algorithm/Evolutionary Computing framework in Scala, using network-parallelism to solve optimization problems fast. If you're interested, check it out at https://github.com/evvo-labs/evvo
An interesting case is where a human does the selection; this was covered pretty well in Dawkins's 'The Blind Watchmaker' [0] with his program for generating biomorphs. Some nice speculation in there too:

'Dawkins speculated that the unnatural selection role played by the user in this program could be replaced by a more natural agent if, for example, colourful biomorphs could be selected by butterflies or other insects, via a touch-sensitive display set up in a garden.' (from the Wikipedia article)

[0] https://en.wikipedia.org/wiki/The_Blind_Watchmaker
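The general shape of that kind of human-in-the-loop selection can be sketched in a few lines of Python (a generic sketch of the idea, not Dawkins' actual program):

    # Interactive evolution in the spirit of the biomorph program: a person,
    # not a fitness function, picks the parent for the next generation.
    def interactive_evolution(random_genome, mutate, render, brood=8, rounds=20):
        parent = random_genome()
        for _ in range(rounds):
            offspring = [mutate(parent) for _ in range(brood)]
            for i, child in enumerate(offspring):
                print(i, render(child))
            parent = offspring[int(input("Pick your favourite: "))]
        return parent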
A lot of the time there are better solutions than a GA, but something I've always liked about them is how easy they are to understand, even for AI noobs.

I guess it's because of the similarities to the evolution of life, but also because they're just quite simple.
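For a sense of just how simple: a bare-bones generational GA (tournament selection, one-point crossover, bit-flip mutation) fits comfortably on one screen. This Python sketch with made-up defaults is roughly the whole idea:

    import random

    def minimal_ga(fitness, genome_len=64, pop_size=100, generations=200, rate=0.01):
        # Random initial population of bit strings.
        pop = [[random.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
        for _ in range(generations):
            # Tournament selection: pick two at random, keep the fitter one.
            def pick():
                a, b = random.sample(pop, 2)
                return a if fitness(a) >= fitness(b) else b
            new_pop = []
            for _ in range(pop_size):
                p1, p2 = pick(), pick()
                cut = random.randrange(1, genome_len)          # one-point crossover
                child = p1[:cut] + p2[cut:]
                new_pop.append([bit ^ (random.random() < rate) for bit in child])
            pop = new_pop
        return max(pop, key=fitness)

    # e.g. maximise the number of 1s in a 64-bit string; result should be near 64.
    print(sum(minimal_ga(sum)))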
GAs are a lot of fun, but they are often very time-consuming for fairly straightforward optimisation problems. Other iterative searches like simulated annealing or just plain ol' dumb hill climbing are a lot faster most of the time.
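For comparison, simulated annealing is barely more code than hill climbing; a minimal Python sketch, assuming a geometric cooling schedule and purely illustrative parameters:

    import math, random

    # Accept a worse neighbour with probability exp(-delta / T), where T decays
    # geometrically each step, so the search becomes greedier over time.
    def simulated_annealing(cost, start, neighbour, t0=1.0, cooling=0.999, iters=20_000):
        x, c = start, cost(start)
        t = t0
        for _ in range(iters):
            y = neighbour(x)
            cy = cost(y)
            if cy < c or random.random() < math.exp((c - cy) / t):
                x, c = y, cy
            t *= cooling
        return x, c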