back-propagation
(Or "backpropagation") A learning algorithm that modifies a feed-forward neural network so as to minimise a continuous "error function" or "objective function." Back-propagation is a "gradient descent" method of training in that it uses gradient information to modify the network weights to decrease the output error. Other gradient based methods from numerical analysis can be used to train networks more efficiently.
Back-propagation uses a mathematical trick (an application of the chain rule of differentiation) when the network is simulated on a digital computer, yielding, in just two traversals of the network (once forward, and once back), both the difference between the desired and actual output, and the derivatives of this difference with respect to the connection weights.

Last updated: 2020-05-17
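The sketch below illustrates this two-traversal procedure in NumPy; the 2-3-1 network shape, the XOR training set, the sigmoid activation, and the learning rate eta are all assumptions chosen for the example, not details from the entry. Each iteration performs one forward pass (producing the actual output and its difference from the desired output) and one backward pass (producing the error derivatives used in the gradient descent weight update).

```python
# Minimal back-propagation sketch: a 2-3-1 sigmoid network trained
# on XOR. All names, sizes, and constants here are illustrative.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)   # desired outputs

W1 = rng.normal(size=(2, 3))   # input -> hidden weights
b1 = np.zeros(3)
W2 = rng.normal(size=(3, 1))   # hidden -> output weights
b2 = np.zeros(1)
eta = 0.5                      # learning rate (assumed value)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(10000):
    # Forward traversal: compute the actual outputs.
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    err = y - t                        # difference from desired output
    # Backward traversal: chain rule yields dE/dw for every weight.
    delta2 = err * y * (1 - y)         # output-layer error signal
    delta1 = (delta2 @ W2.T) * h * (1 - h)
    # Gradient descent step on every weight and bias.
    W2 -= eta * h.T @ delta2
    b2 -= eta * delta2.sum(axis=0)
    W1 -= eta * X.T @ delta1
    b1 -= eta * delta1.sum(axis=0)

print(np.round(y, 2))   # should approach the XOR targets 0, 1, 1, 0
```

Note that the backward pass reuses the activations h and y saved during the forward pass, which is why both the output error and all of its weight derivatives come out of just the two traversals.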