Learn the Mini-Batch Gradient Descent algorithm and some of the key advantages and disadvantages of using this technique. Examples are given in Python.

Learn the Stochastic Gradient Descent algorithm and some of the key advantages and disadvantages of using this technique. Examples are given in Python.

Gain insight and intuition into how Neural Networks learn. A worked example, covering each step in the learning procedure, is included.

Learn the Batch Gradient Descent algorithm and some of the key advantages and disadvantages of using this technique. Examples are given in Python.
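As a quick taste of what the gradient descent posts cover, here is a minimal, illustrative sketch of a batch gradient descent update for linear regression. The function name and hyperparameters are placeholders, not the actual code from the posts.

```python
import numpy as np

def batch_gd(X, y, lr=0.1, steps=100):
    """Illustrative batch gradient descent for least-squares linear regression."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        # Gradient of mean squared error, computed over the FULL batch
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

X = np.array([[1.0], [2.0], [3.0]])
y = np.array([2.0, 4.0, 6.0])
w = batch_gd(X, y)  # converges toward the true slope of 2
```

Stochastic and mini-batch variants differ only in how many rows of `X` contribute to each gradient step.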

Cross Entropy serves as a loss function in machine learning classification problems. Learn all about the Cross Entropy Loss here.
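To preview the idea behind that post, cross entropy for a single prediction can be sketched as below; the function and the example values are illustrative, not taken from the post itself.

```python
import math

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Cross entropy between a one-hot target and predicted class probabilities."""
    # eps guards against log(0) for zero-probability predictions
    return -sum(t * math.log(max(p, eps)) for t, p in zip(y_true, y_pred))

loss = cross_entropy([0, 1, 0], [0.1, 0.8, 0.1])  # equals -log(0.8)
```

The loss is small when the model assigns high probability to the true class and grows without bound as that probability approaches zero.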

This post covers the Backpropagation algorithm used to train supervised Neural Networks, including its motivation and mathematical details.