mkarra

Given a loss, one way to adjust the weights would be to randomly pick new values for the weights and observe how the loss changes. However, this would be extremely inefficient, which is why we use algorithms such as gradient descent (see the following slides) to adjust the network parameters.
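To make the contrast concrete, here is a minimal sketch (not from the slides) comparing random search with gradient descent on a hypothetical one-parameter loss, L(w) = (w - 3)^2; the toy loss, the learning rate, and the iteration counts are all illustrative assumptions, not values from the lecture.

```python
# Sketch only: contrasts random search with gradient descent on a toy loss.
import numpy as np

rng = np.random.default_rng(0)

def loss(w):
    # Hypothetical loss with its minimum at w = 3.
    return (w - 3.0) ** 2

def grad(w):
    # Analytic gradient dL/dw of the toy loss above.
    return 2.0 * (w - 3.0)

# Random search: propose a random weight, keep it only if the loss improves.
w_rand, best = 0.0, loss(0.0)
for _ in range(100):
    candidate = rng.normal(w_rand, 1.0)
    if loss(candidate) < best:
        w_rand, best = candidate, loss(candidate)

# Gradient descent: step opposite the gradient, scaled by a learning rate.
w_gd, lr = 0.0, 0.1
for _ in range(100):
    w_gd -= lr * grad(w_gd)

print(f"random search:    w = {w_rand:.4f}, loss = {loss(w_rand):.6f}")
print(f"gradient descent: w = {w_gd:.4f}, loss = {loss(w_gd):.6f}")
```

Even on this one-dimensional toy problem, the gradient tells us which direction to move and roughly how far, whereas random search only stumbles toward the minimum; the gap grows dramatically once a network has millions of parameters.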
