When we worked through this example in breakout rooms during lecture, my group thought back to the data-parallel programming abstraction from earlier in this course, and how applying composed operations to data with functions like `map` opens the door to optimizations that improve performance. Going back to lecture 3, I think this slide perfectly encapsulates that idea, and there are some great parallels (no pun intended) between the improved performance we see here and the data-parallel programming ideas from earlier in the quarter.
Improved temporal locality boosts performance when the program is bandwidth-bound, but it does not help when the program is compute-bound.
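As a rough sketch of why composing map-style operations helps a bandwidth-bound program (illustrative Python, not from the lecture; the function names `scale` and `offset` are made up): fusing two passes into one applies both operations while each element is still hot in cache, so the data is streamed through memory once instead of twice.

```python
def scale(x):
    return 2.0 * x

def offset(x):
    return x + 1.0

data = list(range(100_000))

# Two separate passes: the intermediate list 'tmp' is written out in
# full, then read back -- extra memory traffic on every element.
tmp = [scale(x) for x in data]
out_two_pass = [offset(x) for x in tmp]

# Fused pass: both operations are applied back-to-back while the
# element is still "hot", so each element makes one trip through memory.
out_fused = [offset(scale(x)) for x in data]

assert out_two_pass == out_fused
```

The results are identical; only the memory traffic differs. That is why fusion helps when memory bandwidth is the bottleneck but does nothing when the arithmetic itself dominates.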