You can think of a neural network as just a bunch of logistic regressions: each node in the next layer is a new logistic regression, so in order to compute the last layer you must compute all the layers before it. This is called feedforward propagation.
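Here's a minimal sketch of that idea in Python/NumPy (the sigmoid activation and the layer/weight layout are my own illustration, not from the original post):

```python
import numpy as np

def sigmoid(z):
    # The logistic function each "logistic regression" node applies.
    return 1.0 / (1.0 + np.exp(-z))

def feedforward(x, weights, biases):
    # Each layer takes the previous layer's output, applies its own
    # weights and the logistic function, and passes the result on.
    # That's why you have to compute earlier layers before later ones.
    activation = x
    for W, b in zip(weights, biases):
        activation = sigmoid(W @ activation + b)
    return activation
```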

But in the same way you update the weights of a logistic regression, you do this for each node "logistic regression" that your neural network has. To update the weights, you start from what the output should be, then update the previous layer, and keep doing this until you reach the input weights. This is backpropagation.
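A rough sketch of one backpropagation step, again assuming NumPy, sigmoid layers, and a squared-error loss (all my own choices for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop_step(x, y, weights, biases, lr=0.1):
    # Forward pass: keep every layer's activation so we can reuse it
    # when computing the gradients.
    activations = [x]
    for W, b in zip(weights, biases):
        activations.append(sigmoid(W @ activations[-1] + b))

    # Start from the error at the output (what the output should be vs.
    # what it was), then walk backwards layer by layer.
    delta = (activations[-1] - y) * activations[-1] * (1 - activations[-1])
    for i in reversed(range(len(weights))):
        grad_W = np.outer(delta, activations[i])
        grad_b = delta
        # Propagate the error to the previous layer *before* updating,
        # so the update of this layer doesn't change the propagated error.
        delta = (weights[i].T @ delta) * activations[i] * (1 - activations[i])
        weights[i] -= lr * grad_W
        biases[i] -= lr * grad_b
    return weights, biases
```

Each pass through that loop is the "update one logistic regression's weights" step, repeated from the output layer back down to the input weights.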

I’m not an expert, but that’s how I understood it.



Source: https://www.reddit.com/r//comments/9ddg3y/d_what_do_you_think_is_the_best_way_to_/
