fizzbuzz

It seems that while it's very difficult to build a compiler that auto-parallelizes code in the general case, it is possible to do so with enough domain assumptions (a rough sketch of the distinction is at the end of this comment).

I wonder whether this pattern holds for other properties we might want compilers to help us with in our programs.
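To illustrate the kind of domain assumption that matters, here is a rough C++ sketch (not from the slides; the function names are made up and OpenMP is just one way to encode the assumption). In the general loop the compiler can't rule out aliasing between the pointers, so it can't prove the iterations are independent; in the element-wise version the programming model promises independence, and parallelizing becomes trivial.

```cpp
#include <cstddef>
#include <vector>

// General case: the compiler cannot safely parallelize this loop on its own,
// because `dst` and `src` might alias (e.g. dst == src + 1 would create a
// loop-carried dependence), and nothing in the signature rules that out.
void scale_general(float* dst, const float* src, std::size_t n, float k) {
    for (std::size_t i = 0; i < n; ++i)
        dst[i] = k * src[i];
}

// Domain-restricted case: if the programming model guarantees that every
// iteration writes a distinct element (a pure element-wise map), the
// parallelization decision is trivial. Here that assumption is stated
// explicitly via OpenMP; a DSL could instead make it true by construction.
void scale_elementwise(std::vector<float>& dst,
                       const std::vector<float>& src, float k) {
    #pragma omp parallel for
    for (long i = 0; i < static_cast<long>(src.size()); ++i)
        dst[i] = k * src[i];
}
```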

gpu

This feels very much like a meta-learning problem to me: can a compiler learn the appropriate parallelization given a set of domain-specific (i.e., task-specific) assumptions? Looking into the literature, it seems there has been some research in this area. For those interested in taking a dive down the rabbit hole, this literature review seems promising: https://link.springer.com/article/10.1007/s00607-018-0614-9.

In general, it would be immensely powerful to "collapse" domain-specific languages into a single meta-optimizing language, or a small set of them.
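To make the "meta-optimizing" idea a bit more concrete: systems like Halide already separate the algorithm (what to compute) from the schedule (how to parallelize, tile, etc.), and one could imagine a shared core where each DSL contributes only its domain assumptions. Below is a toy, hedged C++ sketch of that separation; everything in it (ElementwisePipeline, realize, the Schedule enum) is hypothetical and only illustrates the idea, not any real system's API.

```cpp
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

// Toy sketch of an algorithm/schedule split (all names here are hypothetical).
// The "algorithm" is a pure per-index function; because the core knows the
// computation is element-wise, any schedule it picks is guaranteed legal.
struct ElementwisePipeline {
    std::function<float(std::size_t)> algorithm;   // what to compute
    enum class Schedule { Serial, Parallel };
    Schedule schedule = Schedule::Serial;          // how to compute it

    std::vector<float> realize(std::size_t n) const {
        std::vector<float> out(n);
        if (schedule == Schedule::Serial) {
            for (std::size_t i = 0; i < n; ++i) out[i] = algorithm(i);
        } else {
            // Parallel schedule: strided split of the index range over threads.
            // Each index is written by exactly one thread, so there is no race.
            std::size_t workers = std::max(1u, std::thread::hardware_concurrency());
            std::vector<std::thread> pool;
            for (std::size_t w = 0; w < workers; ++w)
                pool.emplace_back([&out, this, n, w, workers] {
                    for (std::size_t i = w; i < n; i += workers)
                        out[i] = algorithm(i);
                });
            for (auto& t : pool) t.join();
        }
        return out;
    }
};

// Hypothetical usage: the same algorithm runs under either schedule.
// ElementwisePipeline p;
// p.algorithm = [](std::size_t i) { return 2.0f * static_cast<float>(i); };
// p.schedule  = ElementwisePipeline::Schedule::Parallel;
// auto result = p.realize(1 << 20);
```

The point of the split is that the legality of the parallel schedule comes from the domain assumption (a pure element-wise computation), not from whole-program analysis, which is what would let a shared "meta" layer reuse the same optimization machinery across different DSLs.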

ishangaur

@gpu, does that article give any insight into what the features of such a language might be? I would assume it would look like some kind of functional language, maybe even with dynamic typing. But are there known properties we might want from such a system?

haiyuem

I'm also curious about what @ishangaur was asking. I wonder how much a DSL might know about the program's specification and how "detailed" such a system needs to be.
