+**Exploding Gradient** describes a failure mode in the training of [Neural Networks](/wiki/neural_networks) where the error gradient grows extremely large as it is propagated backward through the layers. The resulting oversized weight updates destabilize training, often forcing a smaller [Learning Rate](/wiki/learning_rate) and preventing effective convergence.
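A minimal sketch of the mechanism, not taken from this article: during backpropagation through a chain of layers (or time steps) sharing a weight matrix, the gradient is multiplied by that matrix once per step, so any eigenvalue with magnitude above 1 makes the gradient norm grow geometrically. The specific matrix, step count, and the norm-clipping mitigation shown at the end are illustrative assumptions.

```python
import numpy as np

# Hypothetical shared weight matrix with a dominant eigenvalue > 1.
W = np.array([[1.5, 0.3],
              [0.2, 1.4]])

grad = np.ones(2)  # gradient at the final step
norms = []
for _ in range(30):
    grad = W.T @ grad              # one backprop step through the chain
    norms.append(np.linalg.norm(grad))

print(norms[0], norms[-1])         # norm grows roughly geometrically

# A common mitigation (gradient norm clipping): rescale the gradient
# so its norm never exceeds a chosen threshold.
threshold = 5.0
clipped = grad * min(1.0, threshold / np.linalg.norm(grad))
print(np.linalg.norm(clipped))     # at most the threshold
```

Swapping `W` for a matrix whose eigenvalues all have magnitude below 1 produces the opposite behavior, the related vanishing-gradient problem.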
+## See also
+- [Vanishing Gradient](/wiki/vanishing_gradient)
+- [Gradient Descent](/wiki/gradient_descent)
+- [Backpropagation](/wiki/backpropagation)
... 1 more lines