Backpropagation

The error correction mechanism in neural networks.

Published May 8, 2024
  • Backpropagation is the mechanism by which a neural network corrects itself during training
  • The network compares the output it has computed (its prediction) with the actual, expected output
  • The mismatch between the prediction and the expected output is measured by a loss function
  • The gradient of that loss is then propagated backward through the network and used to adjust the weights, as sketched in the code after this list
  • In short, backpropagation is the process by which a network learns from its errors
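The loop the bullets describe is: forward pass, loss computation, backward pass, weight update. Below is a minimal sketch of that loop in Python with NumPy for a tiny two-layer network. The architecture, learning rate, toy XOR data, and use of a mean-squared-error loss are illustrative assumptions, not details from the article above.

```python
import numpy as np

# Minimal backpropagation sketch: one hidden layer, mean-squared-error loss,
# plain gradient descent. Data, sizes, and learning rate are illustrative.

rng = np.random.default_rng(0)

# Toy data: inputs X and expected outputs y (XOR mapping).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# Randomly initialised weights and biases.
W1 = rng.normal(scale=0.5, size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(5000):
    # Forward pass: compute the network's prediction.
    h = sigmoid(X @ W1 + b1)
    y_hat = sigmoid(h @ W2 + b2)

    # Loss function: measures the mismatch between prediction and target.
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass: propagate the error back through each layer with the
    # chain rule to get the gradient of the loss w.r.t. every weight.
    d_y_hat = 2 * (y_hat - y) / y.shape[0]      # dL/dy_hat
    d_y_pre = d_y_hat * y_hat * (1 - y_hat)     # through the output sigmoid
    dW2 = h.T @ d_y_pre
    db2 = d_y_pre.sum(axis=0, keepdims=True)
    d_h = d_y_pre @ W2.T
    d_h_pre = d_h * h * (1 - h)                 # through the hidden sigmoid
    dW1 = X.T @ d_h_pre
    db1 = d_h_pre.sum(axis=0, keepdims=True)

    # Weight update: nudge each weight against its gradient.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final loss: {loss:.4f}")
```

Running the sketch prints the final loss, which shrinks as the repeated compare-and-adjust cycle reduces the network's error; real frameworks compute these gradients automatically, but the underlying procedure is the same.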