
Hi r/machinelearning , any chance with a hand with a Neural Networks problem?


Basically, I'm taking a university course and the backpropagation algorithm has come up. I understand how it works, except for the following:

In this video (http://www.youtube.com/watch?v=p1-FiWjThs8&feature=relmfu), it is stated that I need the gradient of the error (and therefore the derivative of the node's activation function) for the output node before continuing. The bit I mean is at 6:30.

However, in the example I have, the output node is a summation node, and I'm not sure how to find the error gradient in this case. I can't take the derivative of the activation function (a sum), can I?
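(For anyone with the same question: a summation node is just a linear/identity activation, f(z) = z, so its derivative is 1 and the output delta reduces to the error derivative alone. A minimal sketch, assuming squared error E = 0.5*(y - t)^2 and names like `output_delta` chosen purely for illustration:)

```python
# Sketch: error gradient at a summation (identity) output node.
# Assumes squared error E = 0.5 * (y - t)**2; f(z) = z so f'(z) = 1.

def output_delta(y, t):
    """Delta at a linear (sum) output node: dE/dy * f'(z) = (y - t) * 1."""
    return (y - t) * 1.0

# Numerical check against a finite-difference gradient of E.
y, t = 2.5, 1.0
E = lambda y: 0.5 * (y - t) ** 2
eps = 1e-6
numeric = (E(y + eps) - E(y - eps)) / (2 * eps)
analytic = output_delta(y, t)
print(analytic, numeric)  # both approximately 1.5
```

So no inverse or nonlinear derivative is needed; the "derivative of the sum" with respect to its pre-activation is simply 1.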

Thanks in advance!

submitted by AryanHonesty
