[DOCS] Introduction to Relay IR. #2185
Conversation
cc @szha
Force-pushed 944c947 to 7c64e66
docs/dev/relay_intro.rst
Outdated
| framework developer choose the representation they are familiar with.
| This does, however, have some implications on how we write passes:
|
| - If you come from a data-flow background and want to handle let, keep a map of var to the expressions so you can perform lookup when encountering a var. This is a likely means a minimum change as we already need a map from expr-> transformed expression anyway. Note that this will effectively remove all the let in the program.
This is a likely means -> This likely means
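For readers who want to see what that bullet looks like in practice, here is a minimal Python sketch against TVM's relay API. The helper name `inline_lets` is made up for illustration; it is not part of Relay:

```python
import tvm
from tvm import relay

def inline_lets(expr, env=None):
    # Hypothetical helper: handle let by keeping a map from var to its
    # (already transformed) bound expression, as the bullet suggests.
    env = {} if env is None else env
    if isinstance(expr, relay.Var):
        # Look the variable up when we encounter it.
        return env.get(expr, expr)
    if isinstance(expr, relay.Let):
        # Record the bound value, then rewrite the body; the let node
        # itself disappears from the result.
        new_env = dict(env)
        new_env[expr.var] = inline_lets(expr.value, env)
        return inline_lets(expr.body, new_env)
    if isinstance(expr, relay.Call):
        new_args = [inline_lets(a, env) for a in expr.args]
        return relay.Call(expr.op, new_args, expr.attrs, expr.type_args)
    return expr
```

Applied to `let %v1 = log(%x) in add(%v1, %v1)`, this returns the data-flow form of the same program, with both arguments of the add sharing the single log node.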
docs/dev/relay_intro.rst
Outdated
| The Module can be viewed as a ``Map<GlobalVar, Function>``. Here GlobalVar is just an id that is used to represent the functions
| in the module. ``@muladd`` and ``@myfunc`` are GlobalVars in the above example. When a CallNode is used to call another function,
| the corresponding GlobalVar is stored in the op field of the CallNode. It contains a level of indirection -- we need to look up
| body of the called function from the modele using the corresponding GlobalVar. In this particular case, we could also directly
s/modele/module
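As a sketch of that indirection, the module and functions below are illustrative (shapes and names are arbitrary; `tvm.IRModule` is the current spelling, older releases used `relay.Module`):

```python
import tvm
from tvm import relay

x = relay.var("x", shape=(10,))
y = relay.var("y", shape=(10,))
muladd = relay.Function([x, y], relay.add(relay.multiply(x, y), y))

gv = relay.GlobalVar("muladd")
mod = tvm.IRModule({gv: muladd})

a = relay.var("a", shape=(10,))
b = relay.var("b", shape=(10,))
# The CallNode's op field holds the GlobalVar @muladd; the body of the
# callee is looked up from the module when it is actually needed.
mod["myfunc"] = relay.Function([a, b], relay.Call(gv, [a, b]))
print(mod)
```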
docs/dev/relay_intro.rst
Outdated
|
| Different data structures will impact how you might write transformations, and we need to keep that in mind.
| So now, as a deep learning framework developer, you might ask, why do we need let-binding.
| Yours PL friends will always tell you that let is important -- as PL is a quite established field,
s/Yours/Your
docs/dev/relay_intro.rst
Outdated
| This article introduces Relay IR -- the second generation of NNVM.
| We expect readers from two kinds of background -- those who have a programming language background and deep learning
| framework developers who are familiar with the computational graph representation.
| This article is mainly written for deep learning framework developers who are familiar with the computational graph representation.
Is this a little repetitive?
docs/dev/relay_intro.rst
Outdated
| Build Computational Graph with Relay
| ------------------------------------
| Traditional deep learning frameworks use computational graphs as their intermediate representation.
| A computational graph (or data-flow graph), is a directed acyclic graph (DAG) that represent the computation.
represents
docs/dev/relay_intro.rst
Outdated
| construct a simple two-node graph. You can find that the syntax of the example is not that different from existing
| computational graph IR like NNVMv1, with the only difference in terms of terminology:
|
| - Existing frameworks usually uses graph and subgraph
s/uses/use
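To make the comparison concrete, here is one way to build such a two-node graph with the relay python DSL (a sketch; the shape is arbitrary):

```python
from tvm import relay

x = relay.var("x", shape=(10,))   # input placeholder
v1 = relay.log(x)                 # node %1
v2 = relay.add(v1, v1)            # node %2, both inputs are %1
# What a framework would call a graph (or subgraph) is a function in Relay.
f = relay.Function([x], v2)
print(f)
```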
docs/dev/relay_intro.rst
Outdated
|
| Each data-flow node is a CallNode in Relay. The relay python DSL allows you to construct a data-flow quickly.
| One thing we want to highlight in the above code -- is that we explicitly constructed an Add node with
| both input points to ``%1``. When a deep learning framework evaluates the above program, it will compute
both inputs point to ?
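A runnable sketch of that shared-node program, checked numerically (`.numpy()` assumes a recent TVM; older releases spell it `.asnumpy()`):

```python
import numpy as np
from tvm import relay

x = relay.var("x", shape=(10,))
v1 = relay.log(x)
f = relay.Function([x], relay.add(v1, v1))  # both inputs point to %1

# A graph-style evaluator walks nodes in topological order, so %1 is
# computed once; a naive AST interpreter could recompute it, which is
# exactly the ambiguity the let-binding discussion below addresses.
data = np.random.rand(10).astype("float32")
out = relay.create_executor().evaluate(f)(data)
np.testing.assert_allclose(out.numpy(), 2 * np.log(data), rtol=1e-5)
```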
docs/dev/relay_intro.rst
Outdated
| }
|
| Let binding solves this problem, as the computation of the value happens at the let node. In both programs,
| if we change ``%1 = log(%x)`` to ``let %v1 = log(%x)``, we clearly specifies the computation location to
s/specifies/specify
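Sketched with the relay python DSL, the let form of the same program looks like this (shape arbitrary):

```python
from tvm import relay

x = relay.var("x", shape=(10,))
v1 = relay.var("v1", shape=(10,))
# let %v1 = log(%x) in add(%v1, %v1): the value is computed at the let
# node, so the evaluation point is pinned rather than left implicit.
body = relay.Let(v1, relay.log(x), relay.add(v1, v1))
f = relay.Function([x], body)
print(f)
```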
docs/dev/relay_intro.rst
Outdated
| -- we don’t need to worry about where to put the let when we generate the code. The dataflow form also gives more freedom
| to the later passes to decide where to put the evaluation point. As a result, it might not be a bad idea to use data flow
| form of the program in the initial phases of optimizations when you find it is convenient.
| As a matter of fact, many optimizations in relay today are written to optimize dataflow programs.
s/relay/Relay ?
Cool. I think the tutorial is really helpful for people to understand Relay.
docs/dev/relay_intro.rst
Outdated
| Since program optimizations take these AST data structures and transform them, the two different structure will
| affect the compiler code we are going to write. For example, if we want to detect a pattern ``add(log(x), y)``:
|
| - In the data-flow form, we can first access the add node, then directly look at its first arguments to see if it is a log
s/arguments/argument
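A sketch of that data-flow-style check (the function name `is_add_of_log` is made up; a real pass would more likely use an ExprVisitor or Relay's dataflow pattern matching):

```python
import tvm
from tvm import relay

def is_add_of_log(expr):
    # Detect add(log(x), y): start at the add node, then look at its
    # first argument to see whether it is a log.
    def is_op_call(e, name):
        return (isinstance(e, relay.Call)
                and isinstance(e.op, tvm.ir.Op)
                and e.op.name == name)
    return is_op_call(expr, "add") and is_op_call(expr.args[0], "log")

x = relay.var("x", shape=(10,))
y = relay.var("y", shape=(10,))
print(is_add_of_log(relay.add(relay.log(x), y)))  # True
print(is_add_of_log(relay.add(x, relay.log(y))))  # False
```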
docs/dev/relay_intro.rst
Outdated
| One thing we want to highlight in the above code -- is that we explicitly constructed an Add node with
| both input point to ``%1``. When a deep learning framework evaluates the above program, it will compute
| the nodes in topological order, and ``%1`` will only be computed once.
| While the this fact is very natural to deep learning framework builders, it is something that might
While this fact
I found two more typos after another reading.
This is introductory material on Relay IR for developers who have a background in data-flow and computational graphs. This tutorial is the result of discussions with @jroesch @yzhliu @junrushao1994 @MarisaKirisame @slyubomirsky @joshpoll and other folks. We try to blend the views from deep learning frameworks and PL.