How would results from previous slides change if we expanded neural nets to be cyclical and take variable forward steps to evaluate?
xhe17
@gleb, not sure if I understand you correctly, but generally there shouldn't be any cyclical structures in a DNN? Otherwise backpropagation would not be well-defined?
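A small sketch of the point being discussed (this is an illustrative example, not from the slides; the update rule and names are made up): a cycle in the graph has no single well-defined output, but once you fix a number of forward steps T, the cycle unrolls into an ordinary feedforward chain, and backpropagation through that chain is well-defined again. Here the gradient from manual backprop through the unrolled steps matches a finite-difference check.

```python
import numpy as np

def forward(w, x, T):
    """Unroll the self-loop h <- tanh(w*h + x) for T steps, starting at h=0."""
    h = 0.0
    for _ in range(T):
        h = np.tanh(w * h + x)
    return h

def grad_w(w, x, T):
    """Manual backprop through the unrolled chain, with loss L = h_T."""
    # forward pass, storing each hidden state
    hs = [0.0]
    for _ in range(T):
        hs.append(np.tanh(w * hs[-1] + x))
    # backward pass: walk the unrolled steps in reverse
    g, gw = 1.0, 0.0          # g = dL/dh_t, starting with dL/dh_T = 1
    for t in range(T, 0, -1):
        pre = w * hs[t - 1] + x
        d = g * (1.0 - np.tanh(pre) ** 2)  # through tanh
        gw += d * hs[t - 1]                # accumulate dL/dw across steps
        g = d * w                          # pass gradient to previous step
    return gw

w, x, T = 0.5, 1.0, 5
eps = 1e-6
fd = (forward(w + eps, x, T) - forward(w - eps, x, T)) / (2 * eps)
print(abs(grad_w(w, x, T) - fd) < 1e-6)
```

Note that the answer depends on T, which is what "variable forward steps" would change: a different step count is a different unrolled network, so the results from the earlier slides (which assume a fixed feedforward graph) would not carry over directly.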