Gradient-Based Learning Algorithms vs. Global Optimization Learning Algorithms for Neural Networks

First, is posting questions like this accepted in this sub? I usually post these types of questions to Stack Exchange sites, but the few sites that accept neural net questions (Stack Overflow, Cross Validated, and CogSci) are not specifically for machine learning, so such questions often get downvoted there.

Anyway, my post:

Neural networks are usually trained with a gradient-based learning algorithm, such as backpropagation or one of its variants, but can you instead use global optimization (or other derivative-free) algorithms, such as the genetic algorithm, the Nelder-Mead polytope algorithm, or particle swarm optimisation, to train the network?
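
To make the comparison concrete, here is the kind of gradient-based baseline I mean, as a minimal sketch (in Python/NumPy purely for illustration, not my actual MATLAB code; the toy data, layer sizes, and learning rate are made up): a one-hidden-layer network trained by plain gradient descent with backpropagated gradients.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 2))             # toy inputs (made up)
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2          # toy targets

    n_in, n_hidden = 2, 8
    W1 = rng.normal(0.0, 0.5, size=(n_in, n_hidden))  # input -> hidden weights
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.5, size=(n_hidden, 1))     # hidden -> output weights
    b2 = np.zeros(1)

    lr = 0.05                                         # learning rate (illustrative)
    for step in range(2000):
        # forward pass
        h = np.tanh(X @ W1 + b1)
        pred = (h @ W2 + b2).ravel()
        err = pred - y
        cost = 0.5 * np.mean(err ** 2)                # mean squared error

        # backward pass: gradients of the cost w.r.t. every parameter
        d_pred = err[:, None] / len(X)
        dW2 = h.T @ d_pred
        db2 = d_pred.sum(axis=0)
        d_h = (d_pred @ W2.T) * (1.0 - h ** 2)        # tanh'(z) = 1 - tanh(z)^2
        dW1 = X.T @ d_h
        db1 = d_h.sum(axis=0)

        # gradient descent step
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    print("final training cost:", cost)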

Since training a neural network ultimately boils down to minimizing a multivariable cost function over the weights, I assumed it would be straightforward to do with a global optimization method, but I have tried it myself and I'm getting very poor results.
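
Concretely, the framing I have in mind looks like this (again an illustrative Python/NumPy sketch, not my real code; the bounds and iteration counts are arbitrary): the weights are flattened into a single vector w, the mean squared error becomes an ordinary function of w, and that function is handed unchanged to derivative-free optimizers, here SciPy's Nelder-Mead and differential evolution (which is GA-like).

    import numpy as np
    from scipy.optimize import minimize, differential_evolution

    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 2))
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2

    n_in, n_hidden = 2, 8
    sizes = [n_in * n_hidden, n_hidden, n_hidden, 1]  # element counts of W1, b1, W2, b2
    n_params = sum(sizes)

    def unpack(w):
        """Split the flat parameter vector back into weight matrices and biases."""
        chunks, i = [], 0
        for s in sizes:
            chunks.append(w[i:i + s])
            i += s
        W1 = chunks[0].reshape(n_in, n_hidden)
        return W1, chunks[1], chunks[2].reshape(n_hidden, 1), chunks[3]

    def cost(w):
        """Mean squared error of the network, as a plain function of the flat vector w."""
        W1, b1, W2, b2 = unpack(w)
        h = np.tanh(X @ W1 + b1)
        pred = (h @ W2 + b2).ravel()
        return 0.5 * np.mean((pred - y) ** 2)

    w0 = rng.normal(0.0, 0.5, size=n_params)          # random initial synaptic weights

    # derivative-free local search: Nelder-Mead simplex/polytope
    nm = minimize(cost, w0, method="Nelder-Mead", options={"maxiter": 20000})

    # population-based global search: differential evolution (GA-like)
    de = differential_evolution(cost, bounds=[(-5, 5)] * n_params, maxiter=200, seed=0)

    print("Nelder-Mead cost:", nm.fun, "  differential evolution cost:", de.fun)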

The genetic algorithm only reduces my cost function from around 250 (with random initial synaptic weights) to around 170. The Nelder-Mead algorithm would apparently take years to converge, and I have not yet tried PSO, as there is no built-in MATLAB function for it.
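
For reference, a bare-bones global-best PSO loop is short enough to write by hand; this is a generic textbook-style sketch with made-up hyperparameters, not any particular toolbox implementation, and the flat-vector cost function from the sketch above could be passed straight into it.

    import numpy as np

    def pso(cost, n_params, n_particles=40, n_iters=500,
            inertia=0.7, c1=1.5, c2=1.5, bound=5.0, seed=0):
        rng = np.random.default_rng(seed)
        pos = rng.uniform(-bound, bound, size=(n_particles, n_params))  # particle positions
        vel = np.zeros_like(pos)                                        # particle velocities

        pbest = pos.copy()                                  # each particle's best-seen position
        pbest_cost = np.array([cost(p) for p in pos])
        g = np.argmin(pbest_cost)
        gbest, gbest_cost = pbest[g].copy(), pbest_cost[g]  # swarm's best-seen position

        for _ in range(n_iters):
            r1 = rng.random(pos.shape)
            r2 = rng.random(pos.shape)
            # inertia + pull toward personal best + pull toward global best
            vel = inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, -bound, bound)

            costs = np.array([cost(p) for p in pos])
            improved = costs < pbest_cost
            pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
            g = np.argmin(costs)
            if costs[g] < gbest_cost:
                gbest, gbest_cost = pos[g].copy(), costs[g]

        return gbest, gbest_cost

    # quick self-test on a simple quadratic; the flat-vector network cost from
    # the previous sketch could be dropped in the same way
    best_w, best_cost = pso(lambda v: float(np.sum(v ** 2)), n_params=10)
    print("best cost found:", best_cost)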

So is it just accepted that gradient-based algorithms are the most suitable for training a neural network? If so, can someone please point me towards a source for this? That would be very helpful, as I could cite it in my project to explain why I gave up on using global optimization methods to train the network.

Thanks!

submitted by sealturkey
