This is an expensive algorithm because the while loop is sequential, and when training on huge data sets each iteration has to pass over the entire data set to perform the large computation (`evaluate_loss_gradient`).
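To make the cost concrete, here is a minimal sketch of full-batch gradient descent. The name `evaluate_loss_gradient` comes from the text above; its body here (mean-squared-error gradient for a linear model) and the loop structure are assumptions for illustration. Each pass through the while loop touches every one of the `n` examples, so one step costs O(n · d), and the steps cannot run in parallel because each weight update depends on the previous one.

```python
import numpy as np

def evaluate_loss_gradient(w, X, y):
    # Assumed implementation: gradient of mean squared error for a
    # linear model. The full pass over all n rows of X is what makes
    # each iteration expensive on huge data sets.
    return 2.0 * X.T @ (X @ w - y) / len(y)

def batch_gradient_descent(X, y, lr=0.1, tol=1e-6, max_iters=10_000):
    w = np.zeros(X.shape[1])
    iters = 0
    # Sequential while loop: iteration k+1 needs the weights from
    # iteration k, so the loop itself cannot be parallelized.
    while iters < max_iters:
        grad = evaluate_loss_gradient(w, X, y)  # O(n * d) work every step
        if np.linalg.norm(grad) < tol:
            break
        w -= lr * grad
        iters += 1
    return w

# Tiny example: recover the slope of y = 3x.
X = np.array([[1.0], [2.0], [3.0]])
y = np.array([3.0, 6.0, 9.0])
w = batch_gradient_descent(X, y)
```

This per-step full pass is exactly the bottleneck that stochastic and mini-batch variants address, by estimating the gradient from a small sample instead of the whole data set.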