Add 16A8W linear ops support and test #13658
facebook-github-bot merged 5 commits into gh/Ninja91/3/base
Conversation
- Adds a linear ops test using the 16A8W config in the INT16 profile.
- Adds support for the INT16 dtype in view ops validation.
- Validated with the TOSA pipeline test.
- Checked that tests previously marked flaky are no longer flaky and removed the markers.

Note: Not verified with a TOSA reference model run.

Differential Revision: [D80308822](https://our.internmc.facebook.com/intern/diff/D80308822/)
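For readers unfamiliar with the "16A8W" shorthand: it denotes int16 activations with int8 weights, both symmetrically quantized. The sketch below is illustrative only and does not use the actual ExecuTorch/TOSA APIs; `symmetric_quantize` is a hypothetical helper showing the arithmetic a 16A8W linear op relies on.

```python
import numpy as np

def symmetric_quantize(x: np.ndarray, bits: int):
    """Symmetrically quantize x onto a signed `bits`-bit integer grid."""
    qmax = 2 ** (bits - 1) - 1  # 32767 for int16, 127 for int8
    max_abs = float(np.max(np.abs(x)))
    scale = max_abs / qmax if max_abs > 0 else 1.0
    q = np.clip(np.round(x / scale), -qmax - 1, qmax).astype(np.int32)
    return q, scale

np.random.seed(0)
x = np.random.randn(4, 8).astype(np.float32)    # activations
w = np.random.randn(16, 8).astype(np.float32)   # weights

qx, sx = symmetric_quantize(x, bits=16)  # "16A": int16 activations
qw, sw = symmetric_quantize(w, bits=8)   # "8W":  int8 weights

# Linear op y = x @ W.T: accumulate in a wide integer type, then
# rescale back to float with the product of the two scales.
y_quant = (qx.astype(np.int64) @ qw.T.astype(np.int64)) * (sx * sw)
y_ref = x @ w.T
```

The wider 16-bit activation grid keeps the quantization error dominated by the 8-bit weights, which is why `y_quant` tracks `y_ref` closely even without per-channel scales.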
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/13658
Note: Links to docs will display an error until the docs builds have been completed.

❌ 3 New Failures as of commit 019d222 with merge base 9053089. The following jobs have failed:
This comment was automatically generated by Dr. CI and updates every 15 minutes.
ghstack-source-id: 305494502. Pull Request resolved: #13658
This pull request was exported from Phabricator. Differential Revision: D80308822
This PR needs a
Updated version of #13448. Closing the former.
@per @digantdesai Some of CI is failing, but it doesn't appear to be related to the code change here.
per left a comment:

Nice! Two small suggestions, otherwise LGTM.
```python
validate_valid_dtype(
    self.target,
    [inputs[0], output],
    [ts.DType.INT8, ts.DType.INT32, ts.DType.FP32, ts.DType.BOOL],
```
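Since the PR adds INT16 support in view ops validation, the change presumably amounts to extending allow-lists like the one above with `ts.DType.INT16`. The sketch below mocks the enum and validator with stand-in names (the real code uses `ts.DType` from the TOSA serializer and passes tensor objects, not raw dtypes) to show the validation behavior before and after:

```python
from enum import Enum, auto

# Stand-in for the TOSA serializer's dtype enum (ts.DType in the real code).
class DType(Enum):
    INT8 = auto()
    INT16 = auto()
    INT32 = auto()
    FP32 = auto()
    BOOL = auto()

def validate_valid_dtype(op_name, dtypes, valid_dtypes):
    """Raise if any dtype is outside the op's supported set (sketch)."""
    for d in dtypes:
        if d not in valid_dtypes:
            raise ValueError(f"{op_name}: unsupported dtype {d}")

# With INT16 added to the allow-list, 16A8W-quantized view ops validate:
validate_valid_dtype(
    "aten.view_copy",
    [DType.INT16, DType.INT16],
    [DType.INT8, DType.INT16, DType.INT32, DType.FP32, DType.BOOL],
)
```

Without `DType.INT16` in the allow-list, the same call would raise, which is the failure mode the PR removes for the 16A8W config.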
Pull Request resolved: #13658. ghstack-source-id: 305893128.
Merged commit 8d7c687 into gh/Ninja91/3/base
@digantdesai Where was this merged? I can't see it on main... Getting a bit confused by the ghstack stuff 😕
@Ninja91?
This diff was merged, and a new PR was created for the main branch here: #13754. I see that some tests are failing there. It looks like the change to use support_integer() causes FP tests to fail, since some ops are still INT. What would be your suggestion?
This didn't merge with main. Can you push it to main manually?
This PR was created by the merge bot to help merge the original PR into the main branch.

- ghstack PR number: #13658 by @Ninja91 ^ Please use this as the source of truth for the PR details, comments, and reviews
- ghstack PR base: https://github.com/pytorch/executorch/tree/gh/Ninja91/3/base
- ghstack PR head: https://github.com/pytorch/executorch/tree/gh/Ninja91/3/head
- Merge bot PR base: https://github.com/pytorch/executorch/tree/gh/Ninja91/1/orig
- Merge bot PR head: https://github.com/pytorch/executorch/tree/gh/Ninja91/3/orig

@diff-train-skip-merge

cc @digantdesai @freddan80 @per @zingo @oscarandersson8218

Co-authored-by: Nitin Jain <jainnitin@meta.com>
Stack from ghstack (oldest at bottom):
Note: Not verified with tosa reference model run.
Differential Revision: D80308822