Do not attempt to cancel reshape when not all tensors are dominated#4823
Conversation
!test --diff |
Review updated until commit 6c0e7f8
!test --pybench |
!test |
!test |
jjsjann123 left a comment:

Stamping to unblock. I'll let you make your decision on whether to add a separate repro for tracking the follow-up.
// cancelReshapeInLoopDomains does.
//
// scheduler_tools::cancelReshapeInLoopDomains(
//     largest_input, /*skip_innermost_id=*/true);
Nitpick: if we are adding a repro, let's put a comment here linking to the repro that led to the decision to disable the optimization.
!test |
I decided to disable the cancellation of reshape in the resize scheduler. It was originally added in #3679.

It results in about a 10% perf regression in the RoPE benchmarks (http://nv/eO-).

The optimization should eventually be re-enabled, but rather than ad-hoc patching, I feel we should investigate and fix the root cause of the issue, which is cycles in the exact graph. Tracking issue: #4839
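The "cycles in the exact graph" failure mode can be illustrated with a toy model. This is purely a hypothetical sketch, not nvFuser's actual data structures or APIs: it treats the exact graph as a directed graph of iter-domain groups and shows how mapping a reshape output back onto its input can close a cycle.

```python
# Toy model (hypothetical, not nvFuser code): nodes stand for iter-domain
# groups, edges for dependencies between them. "Cancelling" a reshape is
# modeled as adding an edge mapping the reshape output back to its input;
# if the two nodes are already connected, this closes a cycle.

from collections import defaultdict

def has_cycle(edges):
    """Detect a cycle in a directed graph given as {node: [successors]}."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = defaultdict(int)

    def dfs(u):
        color[u] = GRAY
        for v in edges.get(u, []):
            if color[v] == GRAY:  # back edge found -> cycle
                return True
            if color[v] == WHITE and dfs(v):
                return True
        color[u] = BLACK
        return False

    nodes = set(edges) | {v for vs in edges.values() for v in vs}
    return any(color[n] == WHITE and dfs(n) for n in nodes)

# Before cancellation: a DAG (the reshape output depends on its input).
exact_graph = {"in": ["reshape_out"], "reshape_out": ["consumer"]}
assert not has_cycle(exact_graph)

# Cancelling the reshape maps reshape_out back onto "in"; in this toy
# model that adds the reverse edge, which closes a cycle.
exact_graph["reshape_out"].append("in")
assert has_cycle(exact_graph)
```

In this simplified view, a scheduler pass that assumes the graph is acyclic would then loop or fail, which motivates disabling the cancellation until the root cause is addressed.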