Slide 52 of 62
gleb

How would the results from the previous slides change if we allowed the neural nets to be cyclical and to take a variable number of forward steps during evaluation?

xhe17

@gleb, I'm not sure I understand you correctly, but a standard feed-forward DNN shouldn't have any cyclical structures; otherwise backpropagation over the graph wouldn't be well-defined. Recurrent networks do have a cycle in that sense, but it's handled by unrolling the recurrence over time into an acyclic graph (backpropagation through time), so the unrolled graph that backprop actually sees is still a DAG.
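
To make that concrete, here is a minimal sketch (my own illustration, not from the slides) of a tanh RNN cell: the recurrent edge h_t -> h_{t+1} is the "cycle," but unrolling it for T steps gives an acyclic graph, so gradients for the shared weights can be accumulated step by step. A variable number of forward steps, as in the question above, just means unrolling for a different T. All names (W, U, hs, etc.) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
H, D, T = 4, 3, 5                       # hidden size, input size, unrolled steps
W = rng.normal(scale=0.1, size=(H, H))  # recurrent weights (the "cyclic" edge)
U = rng.normal(scale=0.1, size=(H, D))  # input weights
xs = rng.normal(size=(T, D))            # input sequence
y = rng.normal(size=H)                  # target for the final hidden state

# Forward: unroll the recurrence h_t = tanh(W h_{t-1} + U x_t) for T steps.
hs = [np.zeros(H)]                      # h_0
for t in range(T):
    hs.append(np.tanh(W @ hs[-1] + U @ xs[t]))
loss = 0.5 * np.sum((hs[-1] - y) ** 2)

# Backward (BPTT): walk the unrolled, acyclic graph in reverse,
# accumulating gradients for the shared weights at every step.
dW, dU = np.zeros_like(W), np.zeros_like(U)
dh = hs[-1] - y                         # dL/dh_T
for t in reversed(range(T)):
    da = dh * (1.0 - hs[t + 1] ** 2)    # through tanh: h = tanh(a)
    dW += np.outer(da, hs[t])           # same W reused at every step
    dU += np.outer(da, xs[t])
    dh = W.T @ da                       # pass gradient to the previous step

print(f"loss = {loss:.4f}, ||dW|| = {np.linalg.norm(dW):.4f}")
```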
