Hello!
I posted a few days ago about the idea of using evolutionary algorithms to evolve the dynamics of a generic network of nodes: the activation function, the synaptic update rule, a connection establisher/remover, and an encoder/decoder. The original post is available here. There was a lot of skepticism but also encouragement; thank you for both!
I have since brought my meta-learning algorithm into a working state. It is still far from done, but its first results show that the concept can work. I call it "HyperNet", since it essentially optimizes only hyperparameters (in this case, functions).
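To give a rough idea of what "optimizing functions as hyperparameters" means, here is a stripped-down sketch (not my actual code; the names and signatures are simplified for illustration):

```python
from dataclasses import dataclass
from typing import Callable

# Illustrative only -- these names/signatures are not the real HyperNet
# interfaces, just the shape of the idea: a candidate is a bundle of
# functions, not a set of weights.
@dataclass
class Genome:
    activation: Callable[[float], float]                     # node input sum -> node output
    synaptic_update: Callable[[float, float, float], float]  # (weight, pre, post) -> new weight
    connect: Callable[[float, float], bool]                  # grow an edge between two nodes?
    prune: Callable[[float], bool]                           # remove an edge with this weight?
    encode: Callable[[list], list]                           # task inputs -> network inputs
    decode: Callable[[list], list]                           # network outputs -> task outputs
```

The outer evolutionary loop mutates and selects these functions themselves; the inner network only ever executes them.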
The first experiment I ran was XOR. It kept getting stuck in a local optimum where it would output the same value for every input, but after some bug fixes and tweaks I got it out of there. Just now (5 minutes before writing this) it escaped into the next local optimum, where all but one of the four XOR outputs are correct. This was after only 100 generations, and it was still steadily improving when I stopped it (it was taking quite a while ;) ). I will dedicate more time to evolving proper learning rules after submitting this post.
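For context, a toy version of an XOR fitness function might look like the following (a simplified illustration, not my actual evaluation; `net` is a hypothetical stand-in for running the evolved dynamics on one input):

```python
# Toy XOR fitness: negative squared error over the four cases, so higher
# is better for selection.
XOR_CASES = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def fitness(net):
    return -sum((net(x) - target) ** 2 for x, target in XOR_CASES)

# The degenerate local optimum: a network that ignores its input.
constant_net = lambda x: 0.5
print(fitness(constant_net))  # -1.0 is the best any constant output can score
```

Negative error keeps "higher is better" for selection; counting correct outputs would work just as well for a signal this coarse.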
I just had to post about this; it's so exciting!
I will upload the code when it is ready (i.e., less buggy).