[RELAY][BACKEND] CompileEngine refactor. #2059
Conversation
cc @jroesch @eqy @MarisaKirisame @ZihengJiang please review
I will review first thing in the morning.
    return graph_json, mod, params
...
class GraphExecutor(_interpreter.Executor):
Why should the executor code be located in build_module?
Mainly because the graph build logic lives here, and we can reuse the optimize function.
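For illustration, here is a minimal, self-contained sketch of that design point: the executor sits next to the build logic so it can call the same optimize/build pipeline instead of re-implementing it. The names below (optimize, build, GraphExecutor.evaluate) are stand-ins, not the actual TVM API.

```python
# Minimal sketch (hypothetical names, not the actual TVM API): the executor is
# colocated with the build logic and reuses the same optimize/build pipeline.

def optimize(func):
    """Shared optimization pipeline, used by build() and by the executor."""
    return "optimized({})".format(func)

def build(func, target):
    """Build a graph module on top of the shared optimization step."""
    return "graph_module({}, {})".format(optimize(func), target)

class GraphExecutor:
    """Executor colocated with the build logic; evaluate() reuses build()."""

    def __init__(self, target):
        self.target = target

    def evaluate(self, func):
        graph_module = build(func, self.target)  # reuse, do not duplicate
        def _run(*args):
            return "run {} with args {}".format(graph_module, args)
        return _run

runner = GraphExecutor(target="llvm").evaluate("my_func")
print(runner(1, 2))
```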
Overall looks good 👍. I think we should still lift all local functions into global definitions before optimizing; it is a good invariant and allows us to do global transformations (in follow-up passes) without having to rewrite the functions which depend on them. If we are worried about performance we can re-perform inlining at the end. @MarisaKirisame could you take a look at this? It affects us.
Re global function lifting: I did not do it in the fuse-ops pass to make sure the current API does not depend heavily on Module. Once we introduce the entry_func field into Module I think we will be more comfortable using modules; that can come as a follow-up.
@jroesch how do we handle closures? If all functions are global I don't think we even have the concept of a closure, as we have no partial application.
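For readers unfamiliar with the lifting discussion, here is an illustrative, self-contained Python sketch of lambda lifting / closure conversion (not Relay code): a local function that captures a free variable becomes a global definition with the captured variable as an explicit parameter, which is exactly why the lack of partial application matters.

```python
# Illustrative sketch of lambda lifting / closure conversion (not Relay code).
from functools import partial

def make_scaler(scale):
    def scaled(x):          # local function closing over `scale`
        return x * scale
    return scaled

# After lifting: `scaled` becomes a global definition and `scale` is explicit.
def scaled_lifted(scale, x):
    return x * scale

def make_scaler_lifted(scale):
    # With partial application we can still return a callable of one argument;
    # without it, the compiler must materialize an explicit closure record,
    # which is the concern raised above.
    return partial(scaled_lifted, scale)

assert make_scaler(3)(4) == make_scaler_lifted(3)(4) == 12
```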
if tuple_arg_count:
    assert len(call.args) == 1
    assert isinstance(inputs, tuple)
    inputs = list(inputs)
inputs = list(inputs[0])?
Good catch, fixed by a follow-up PR.
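To illustrate the suggested fix (the values below are stand-ins, not the actual interpreter data): when the call's single argument is a tuple value, the evaluated inputs arrive as a one-element tuple wrapping that tuple, so list(inputs) keeps the nesting while list(inputs[0]) flattens it into per-field arguments.

```python
# Stand-in values showing why list(inputs[0]) is the right flattening when the
# call's single argument is a tuple value.
tensor_a, tensor_b, tensor_c = "a", "b", "c"   # stand-ins for evaluated tensors
inputs = ((tensor_a, tensor_b, tensor_c),)     # one evaluated arg, which is a tuple

wrong = list(inputs)      # [('a', 'b', 'c')]  -- nesting kept, lowered op sees 1 arg
right = list(inputs[0])   # ['a', 'b', 'c']    -- flattened into per-field arguments
assert right == ["a", "b", "c"]
```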
Overall pretty good, except for that line, which I think is incorrect.
@jroesch @MarisaKirisame please take another look. I have rebased and added support for tuple input (concat).
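A hypothetical example of the kind of tuple-input graph (concat) this adds support for; the API calls follow the present-day Relay Python frontend and may differ from the code at the time of this PR.

```python
# Hypothetical tuple-input graph: concatenate takes a tuple of tensors.
from tvm import relay

x = relay.var("x", shape=(1, 10))
y = relay.var("y", shape=(1, 10))
out = relay.concatenate([x, y], axis=1)   # operator with a tuple input
func = relay.Function([x, y], out)
```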
MarisaKirisame left a comment:
LGTM
This PR attempts to clean up the interpreter and graph runtime compilation by making the interfaces consistent and organizing them into backend modules. It also adds a CompileEngine that caches previously compiled code and can serve as the backbone of both AOT and JIT. After this PR, I feel most of the compiler scaffolding for the graph compiler is in place.
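A minimal sketch of the caching idea (assumed names, not the real CompileEngine interface): lowering results are memoized on a (function, target) key so the graph runtime codegen and the interpreter/JIT path can reuse previously compiled code.

```python
# Minimal sketch of the caching idea (assumed names, not the real CompileEngine
# interface): memoize lowering results on a (function, target) key.

class CompileCache:
    def __init__(self):
        self._entries = {}

    def lower(self, source_func, target):
        key = (source_func, target)
        if key not in self._entries:
            self._entries[key] = self._do_lower(source_func, target)
        return self._entries[key]

    def _do_lower(self, source_func, target):
        # Placeholder for the real schedule + codegen step.
        return "lowered({}, {})".format(source_func, target)

engine = CompileCache()
assert engine.lower("fused_add", "llvm") is engine.lower("fused_add", "llvm")
```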
Notes on some observations I made during the refactor