👉 SGD stands for Stochastic Gradient Descent. It is an optimization algorithm used to minimize a cost function by iteratively updating a model's parameters, most commonly the weights and biases of a deep learning model (an artificial neural network). At each step, the gradient (the derivative of the cost function with respect to each parameter) is estimated on a small random subset of the training data, called a mini-batch, and the parameters are moved a small step in the opposite direction of that gradient. The "stochastic" part refers to this random sampling, which makes each update cheap compared to computing the gradient over the full dataset.
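The update loop described above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production implementation: it fits a one-variable linear model to synthetic data, and the learning rate, batch size, and epoch count are arbitrary values chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data generated from y = 3x + 1 plus a little noise
X = rng.uniform(-1, 1, size=200)
y = 3.0 * X + 1.0 + rng.normal(0, 0.1, size=200)

w, b = 0.0, 0.0            # parameters to learn
lr, batch_size = 0.1, 16   # illustrative hyperparameters

for epoch in range(100):
    # Shuffle, then walk through the data in mini-batches
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        xb, yb = X[batch], y[batch]
        err = (w * xb + b) - yb
        # Gradients of the mean squared error w.r.t. w and b
        grad_w = 2.0 * np.mean(err * xb)
        grad_b = 2.0 * np.mean(err)
        # Step against the gradient
        w -= lr * grad_w
        b -= lr * grad_b

print(w, b)  # should end up close to the true values 3 and 1
```

Each inner iteration uses only 16 of the 200 points to estimate the gradient, which is exactly the "stochastic" trade-off: noisier steps, but far less computation per update.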