Slide 27 of 79
endofmoore

This is an expensive algorithm because the while loop is inherently sequential, and we train on huge data sets: every iteration must perform the large computation (evaluate_loss_gradient), which passes over the entire training set once per epoch.
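The cost structure described above can be sketched roughly as follows. This is a minimal, hypothetical stand-in (the names `evaluate_loss_gradient`, `gradient_descent`, and the linear-model loss are my assumptions, not taken from the slide): the outer loop is sequential because each update depends on the previous parameters, while each gradient evaluation touches every training example.

```python
import numpy as np

def evaluate_loss_gradient(params, X, y):
    # Hypothetical stand-in for the slide's evaluate_loss_gradient:
    # full-batch gradient of mean squared loss for a linear model.
    # It reads every training example, so cost grows with dataset size.
    preds = X @ params
    return X.T @ (preds - y) / len(y)

def gradient_descent(X, y, step=0.1, tol=1e-6, max_iters=1000):
    params = np.zeros(X.shape[1])
    for _ in range(max_iters):                        # sequential loop
        grad = evaluate_loss_gradient(params, X, y)   # O(n) per iteration
        if np.linalg.norm(grad) < tol:
            break
        params -= step * grad   # each step depends on the previous one
    return params
```

Because each iteration consumes the result of the last, the loop itself cannot be parallelized across iterations; only the per-iteration gradient computation can be split across workers.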
