Including mixed quant GRU op in Jarvis#15011
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/15011
Note: Links to docs will display an error until the docs builds have been completed.
❗ 2 Active SEVs: there are 2 currently active SEVs. If your PR is affected, please view them below.
❌ 4 New Failures, 3 Unrelated Failures as of commit b44d355 with merge base 09eac16.
NEW FAILURES: the following jobs have failed.
BROKEN TRUNK: the following jobs failed but were present on the merge base. 👉 Rebase onto the `viable/strict` branch to avoid these failures.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
Force-pushed from d34b706 to 0867b29.
Summary:

# Context
With the goal of porting mHML to ExecuTorch, a few operators are missing. The main focus is on improving performance for the operators used by the model.

# Summary
This diff adds a general GRU operator and a version optimized for HiFi4 DSPs, ensuring better performance on supported hardware.

---

#hthtemplate

Reviewed By: skrtskrtfb, mcremon-meta

Differential Revision: D81703253
Force-pushed from 0867b29 to 986d302.
Force-pushed from 986d302 to 5467c13.
Force-pushed from 5467c13 to b44d355.
Differential Revision: D81703253
Pull Request resolved: pytorch#15011
Summary
This diff adds a general GRU operator and a version optimized for HiFi4 DSPs, ensuring better performance on supported hardware.
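The kernel sources themselves are not shown in this conversation. As a reference for what the operator computes, here is a minimal NumPy sketch of one step of the standard GRU cell equations (the PyTorch gate formulation). This is an illustration only, not the PR's implementation; the HiFi4 version would presumably realize the same math with fixed-point arithmetic and DSP intrinsics, which is an assumption about this diff, not something visible here.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, w_ih, w_hh, b_ih, b_hh):
    """One GRU step using the standard (PyTorch-style) gate equations.

    x:    (input_size,)   input vector
    h:    (hidden_size,)  previous hidden state
    w_ih: (3*hidden_size, input_size)   stacked reset/update/new input weights
    w_hh: (3*hidden_size, hidden_size)  stacked reset/update/new hidden weights
    b_ih, b_hh: (3*hidden_size,)        stacked biases
    """
    gi = w_ih @ x + b_ih              # input-side contributions to all gates
    gh = w_hh @ h + b_hh              # hidden-side contributions to all gates
    i_r, i_z, i_n = np.split(gi, 3)
    h_r, h_z, h_n = np.split(gh, 3)

    r = sigmoid(i_r + h_r)            # reset gate
    z = sigmoid(i_z + h_z)            # update gate
    n = np.tanh(i_n + r * h_n)        # candidate hidden state
    return (1.0 - z) * n + z * h      # interpolate old and candidate state
```

With all weights and biases zero, both gates evaluate to 0.5 and the candidate state to 0, so the new hidden state is exactly half the previous one; this is a quick sanity check for any optimized kernel against the reference math.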
#hthtemplate
Reviewed By: mcremon-meta
Differential Revision: D81703253