
Conversation

@SiriusNEO (Contributor) commented Apr 5, 2023

Prior to this PR, we had a pass named SimplifyNormInference (#14221), which decomposes batch_norm into simple operators so it can be optimized during inference. There have since been some follow-up changes in #14282.
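For reference, inference-mode batch_norm uses the stored moving statistics, so it decomposes into purely elementwise ops; the standard formula (shown here for context, not quoted from the pass) is:

$$y = \gamma \cdot \frac{x - \mu_{\text{moving}}}{\sqrt{\sigma^2_{\text{moving}} + \epsilon}} + \beta$$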

This PR polishes the changes in #14282, mainly in the following aspects:

  • Use one mutator (instead of two) to visit and decompose ops.
  • Separate inference and training by registering two passes (advised by TQ; see the sketch after this list).
  • Shorten the name: DecomposeCompositeOps -> DecomposeOps.
  • Change the Func Pass to a Module Pass with a func_name parameter that specifies which function to apply the pass to (or None to decompose in all functions). This is because sometimes we only want to decompose operators in a specific function.
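
As a minimal sketch of the two-pass split described above (the pass names follow this PR's description, but the shared helper `CreateDecomposeOpsPass` and the exact signatures are assumptions, not quoted from the diff):

```cpp
#include <tvm/relax/transform.h>
#include <tvm/runtime/registry.h>

namespace tvm {
namespace relax {
namespace transform {

// Hypothetical shared helper: both passes reuse one mutator,
// differing only in whether training-mode ops are decomposed.
Pass CreateDecomposeOpsPass(Optional<String> func_name, bool training);

Pass DecomposeOpsForInference(Optional<String> func_name) {
  return CreateDecomposeOpsPass(func_name, /*training=*/false);
}

Pass DecomposeOpsForTraining(Optional<String> func_name) {
  return CreateDecomposeOpsPass(func_name, /*training=*/true);
}

TVM_REGISTER_GLOBAL("relax.transform.DecomposeOpsForInference")
    .set_body_typed(DecomposeOpsForInference);
TVM_REGISTER_GLOBAL("relax.transform.DecomposeOpsForTraining")
    .set_body_typed(DecomposeOpsForTraining);

}  // namespace transform
}  // namespace relax
}  // namespace tvm
```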

The code is now clearer and easier to read. We currently use an if/else-if chain to recognize the ops we want to decompose. In the future, we could introduce a map, or register the decomposition policy as an op attribute, just as the LegalizeOps pass does.
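
To make the dispatch concrete, here is a simplified sketch of the if/else-if pattern inside the mutator (the class and helper names such as `DecomposeBatchNorm` are illustrative, not copied from the diff):

```cpp
#include <tvm/relax/expr_functor.h>

using namespace tvm;
using namespace tvm::relax;

// Hypothetical decomposition helpers, defined elsewhere.
Expr DecomposeBatchNorm(const Call& call);
Expr DecomposeLayerNorm(const Call& call);

// Sketch: one mutator recognizes decomposable calls via an if/else-if chain.
class OpDecomposer : public ExprMutator {
  Expr VisitExpr_(const CallNode* call_node) final {
    Call call = Downcast<Call>(ExprMutator::VisitExpr_(call_node));
    if (call->op == batch_norm_op_) {
      return DecomposeBatchNorm(call);
    } else if (call->op == layer_norm_op_) {
      return DecomposeLayerNorm(call);
    }
    return call;  // not an op we decompose; leave unchanged
  }

  const Op& batch_norm_op_ = Op::Get("relax.nn.batch_norm");
  const Op& layer_norm_op_ = Op::Get("relax.nn.layer_norm");
};
```

Registering the decomposition as an op attribute (the way LegalizeOps looks up FLegalize) would turn this chain into a single attribute lookup.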

@tvm-bot (Collaborator) commented Apr 5, 2023

Thanks for contributing to TVM! Please refer to the contributing guidelines https://tvm.apache.org/docs/contribute/ for useful information and tips. Please request code reviews from Reviewers by @-ing them in a comment.

Generated by tvm-bot

@SiriusNEO (Contributor, Author):

cc @tqchen @sunggg @Hzfengsy

Review thread on the diff at `Expr SimplifyLayerNorm(const Call& call) {`:
Member:

We don't need to decompose LayerNorm since it can be implemented more efficiently as a single op

Contributor:

Is there any complication for fusion if we have composite ops in general (not specifically LayerNorm)? I.e., is there a case where we cannot fuse the composite op but can once we decompose it?

@SiriusNEO (Contributor, Author):

Ah, here we decompose LayerNorm for now just because it makes its gradient easy to obtain. If it is more efficient as a single op, maybe in the future we can remove it from this pass once we finish the gradient function for the whole LayerNorm.
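
For context, the decomposition in question expands layer_norm into its standard definition, each piece of which (mean, variance, rsqrt, multiply, add) already has a well-known gradient:

$$y = \gamma \cdot \frac{x - \mathbb{E}[x]}{\sqrt{\mathrm{Var}[x] + \epsilon}} + \beta$$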

@SiriusNEO (Contributor, Author):

Or maybe we can add an argument to the pass so that users can specify which operators they want to decompose?

Member:

As decomposing LayerNorm is used for training and may influence the inference perf, let's only decompose it in training mode

@vinx13 (Member) commented Apr 6, 2023:

> Is there any complication for fusion if we have composite ops in general?

Yes, we need to be aware of the implementation when setting op type (kOpaque or kInjective). This will cause different fusion results.
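
For illustration, the pattern kind is attached to an op at registration time; a Relay-style sketch follows (the kind chosen for a given composite op is a design decision, and Relax fusion may consult this information differently):

```cpp
// kOpaque blocks FuseOps from fusing across the op;
// kInjective would let it fuse with neighboring elementwise ops.
RELAY_REGISTER_OP("nn.layer_norm")
    .set_attr<TOpPattern>("TOpPattern", kOpaque);
```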

@sunggg (Contributor) left a comment:

LGTM! Thank you for the improvement.
Just a few comments.


@SiriusNEO force-pushed the unity-dev/2023-04-02-fix-decompose branch from 6bfccf8 to a7dca77 on April 6, 2023 at 07:07.
@tqchen (Member) commented Apr 6, 2023

@sunggg @vinx13 please take another look

@vinx13 merged commit bb479e6 into apache:unity on Apr 6, 2023.