Slide 48 of 63
itoen

The key idea is that the primary data type in a data-parallel model is the entire collection. Similar to the last point about numpy, this reminds me of machine learning frameworks like PyTorch, where accessing or modifying individual elements is awkward and inefficient, and operations are instead applied to entire tensors at once.
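As a minimal sketch of this contrast (assuming PyTorch; the variable names are just for illustration), the per-element loop below fights the model, while the whole-tensor expression is the idiomatic data-parallel form:

```python
import torch

x = torch.randn(1000)

# Element-wise access: technically possible, but each index
# dispatches a tiny separate operation, which is slow and
# goes against the grain of the data-parallel model.
y = torch.empty(1000)
for i in range(x.shape[0]):
    y[i] = x[i] * 2.0 + 1.0

# Data-parallel style: a single operation over the whole
# collection, which the framework can run in parallel.
y = x * 2.0 + 1.0
```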

mhchin

I think keeping this data-parallel model in mind is very helpful. I have been parallelizing tasks in CUDA for my research, and I had a hard time mapping my code onto a parallel computation. It took me a long time to realize that I had to revisit the algorithm itself and reformulate it as a parallelizable one.
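To illustrate the kind of reformulation this can require (a hedged sketch in Python/numpy rather than CUDA; the running-sum task is a hypothetical example, not from the lecture), a loop-carried dependency blocks naive parallelization until the computation is recast as a known data-parallel primitive such as a scan:

```python
import numpy as np

x = np.random.rand(1_000_000)

# Sequential formulation: each iteration depends on the
# previous result, so the loop cannot be naively split
# across threads.
acc = np.empty_like(x)
total = 0.0
for i in range(x.size):
    total += x[i]
    acc[i] = total

# Reformulated as a prefix sum (scan), a standard
# data-parallel primitive with efficient parallel
# implementations, including on GPUs.
acc = np.cumsum(x)
```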
