[RELAY][FRONTEND]Onnx to relay frontend #2302
Conversation
@zhreshold I had some patches on top of this change. You can apply this.
Force-pushed fea4c06 to c045be0
@srkreddy1238 @nishi-t @Huyuwei @hlu1 please help review this PR
srkreddy1238 left a comment
Initial review. Will revisit after the CI passes.
python/tvm/relay/frontend/onnx.py
Outdated
    'dilations': ('dilation', (0, 0)),
    'pads': ('padding', (0, 0), revert_caffe2_pad),
    'group': ('groups', 1)},
    custom_check=dimension_constraint())(inputs, attr, params)
I think we shouldn't pass 3 inputs (if bias available).
will strip to first 2 inputs
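The agreed fix can be sketched in plain Python (hypothetical helper names, not the actual `AttrCvt` machinery): pass only the first two ONNX `Conv` inputs (data, weight) through the conversion, and apply the optional bias as a separate op afterwards.

```python
# Hypothetical sketch (simplified, not the real TVM converter):
# strip the optional bias before attribute conversion, then add it
# back as a separate bias-add op.
def convert_conv(inputs, conv_fn, bias_add_fn):
    """inputs is [data, weight] or [data, weight, bias]."""
    use_bias = len(inputs) == 3
    out = conv_fn(inputs[:2])            # strip to first 2 inputs
    if use_bias:
        out = bias_add_fn(out, inputs[2])  # bias handled separately
    return out
```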
python/tvm/relay/frontend/onnx.py
Outdated
    },
    disables=['output_shape'],
    extras={'use_bias': len(inputs) == 3},
    custom_check=dimension_constraint())(inputs, attr, params)
Same as above.
    return AttrCvt('upsampling')(inputs, attr)


class Shape(OnnxOpConverter):
The shape op is a workaround in NNVM. We may need to revisit in relay.
Can be disabled here and handled outside this PR.
will disable for now
python/tvm/relay/frontend/onnx.py
Outdated
    'LRN': LRN.get_converter(opset),

    # defs/reduction
    #'ReduceMax': AttrCvt('max', transforms={'axes': 'axis'}),
Did you forget to delete the comment line above?
yes, will clean the comments
@srkreddy1238 @nishi-t Can you guys help review again?
jroesch left a comment
Added a bunch of comments on the PR. Mostly issues with documentation; otherwise, if tests pass, it looks good.
python/tvm/relay/frontend/common.py
Outdated
    return default


def get_relay_op(op_name):
    """Get the callable function from relay based on opname:
-    """Get the callable function from relay based on opname:
+    """Get the callable function from Relay based on operator name.
python/tvm/relay/frontend/common.py
Outdated
    Parameters
    ----------
    op_name : str
        The relay operator name.
-        The relay operator name.
+        The Relay operator name.
python/tvm/relay/frontend/common.py
Outdated
    return out_shapes


def infer_channels(inputs, transpose=False):
    """A hack for getting 'channels' or 'units' since caffe2 don't provide
-    """A hack for getting 'channels' or 'units' since caffe2 don't provide
+    """A hack for getting 'channels' or 'units' since caffe2 does not provide
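A minimal sketch of what such a hack might look like (assumed weight layouts, not the exact TVM implementation): read the channel count directly from the weight tensor's shape.

```python
# Hypothetical sketch: infer 'channels' from the weight shape.
# A regular conv weight is laid out (out_channels, in_channels, kH, kW);
# a transposed conv keeps its output channels in dimension 1.
def infer_channels(weight_shape, transpose=False):
    return weight_shape[1] if transpose else weight_shape[0]
```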
python/tvm/relay/frontend/onnx.py
Outdated
@@ -0,0 +1,1077 @@
# pylint: disable=invalid-name, import-self, len-as-condition, unused-argument, too-many-lines
"""ONNX: Open Neural Network Exchange frontend for relay."""
-"""ONNX: Open Neural Network Exchange frontend for relay."""
+"""ONNX: Open Neural Network Exchange frontend for Relay."""
python/tvm/relay/frontend/onnx.py
Outdated
    return _impl


def revert_caffe2_pad(pads):
    """Caffe2 require two times the normal padding."""
-    """Caffe2 require two times the normal padding."""
+    """Caffe2 requires two times the normal padding."""
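A hedged sketch of what the docstring describes (an assumption based on the name, not the verified TVM source): Caffe2-exported ONNX attributes repeat the padding for both ends of each spatial axis, so the helper keeps only the first half.

```python
# Hypothetical sketch of revert_caffe2_pad: Caffe2-style ONNX models
# emit padding twice (begin and end of each spatial axis), so drop
# the duplicated second half.
def revert_caffe2_pad(pads):
    if len(pads) == 4:
        return pads[:2]
    if len(pads) == 2:
        return pads
    raise ValueError("Invalid caffe2 type padding: {}".format(pads))
```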
python/tvm/relay/frontend/onnx.py
Outdated
    inputs,
    attrs,
    opset):
    """Convert from onnx operator to relay operator.
-    """Convert from onnx operator to relay operator.
+    """Convert ONNX operator into a Relay operator.
python/tvm/relay/frontend/onnx.py
Outdated
def from_onnx(model,
              shape=None,
              dtype="float32"):
    """Convert from ONNX"s model into compatible relay Function.
-    """Convert from ONNX"s model into compatible relay Function.
+    """Convert an ONNX model into an equivalent Relay function.
python/tvm/relay/frontend/onnx.py
Outdated
              shape=None,
              dtype="float32"):
    """Convert from ONNX"s model into compatible relay Function.
    Onnx graph is a python protobuf object. The companion parameters will be handled automatically.
-    Onnx graph is a python protobuf object. The companion parameters will be handled automatically.
+    ONNX graphs are represented as a Python Protobuf object. The companion parameters will be handled automatically.
tutorials/relay/from_onnx.py
Outdated
This article is an introductory tutorial to deploy ONNX models with Relay.

For us to begin with, onnx module is required to be installed.
-For us to begin with, onnx module is required to be installed.
+For us to begin with, the ONNX package must be installed.
with relay.build_config(opt_level=1):
    graph, lib, params = relay.build(sym, target, params=params)

######################################################################
Can we maybe just show how to use the create_executor interface in the tutorial as well? It's much simpler than building a graph runtime by hand.
@jroesch Can you please advise how to elegantly evaluate an executor created with an unordered dict of weights?
exec = relay.build_module.create_executor('graph', mod, tvm.cpu(0), 'llvm')
# params is a dict of {'name': tvm.ndarray}
tvm_output = exec.evaluate(mode)(x, *params.values()).asnumpy()  # wrong order
How do I feed the executor with correct weights without changing the params dict to an OrderedDict?
@zhreshold I realize the current API does not actually support keyword argument style for parameters.
I'm going to update it with a PR right now, it should work like this:
exec = relay.build_module.create_executor('graph', mod, tvm.cpu(0), 'llvm')
tvm_output = exec.evaluate(mode)(x, **params).asnumpy()
Does that seem elegant enough?
Yes, that would perfectly solve my current problem
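The ordering problem being discussed can be illustrated without TVM at all (a pure-Python sketch with hypothetical parameter names): positional unpacking of dict values depends on the dict's iteration order, while `**kwargs` binds each value to its parameter by name.

```python
# Pure-Python illustration of why keyword arguments fix the ordering
# problem: *params.values() feeds values positionally in dict order,
# while **params matches each value to its parameter by name.
def evaluate(x, weight=None, bias=None):
    return ("ran", x, weight, bias)

params = {"bias": 0.1, "weight": 2.0}     # arbitrary insertion order

# Positional: bias lands in the `weight` slot because it comes first.
wrong = evaluate(1.0, *params.values())
# Keyword: each parameter gets the right value regardless of order.
right = evaluate(1.0, **params)
```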
Will update once the PR gets merged.
PR is merged.
Co-authored-by: Siju Samuel <siju.samuel@huawei.com>
python/tvm/relay/frontend/onnx.py
Outdated
    def _impl_v1(cls, inputs, attr, params):
        # Result of this operator is prominently used by reshape operator.
        # Just pass the input as it is so that reshape_like can be used there.
        print("Shape: Differently implemented in relay as a bypass (dummy operator)")
Use logging.warning instead of print.
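A sketch of the requested change (with a hypothetical stand-in for the converter body): route the diagnostic through the `logging` module so downstream users can filter or silence it.

```python
import logging

logger = logging.getLogger(__name__)

def shape_impl_v1(inputs):
    # Hypothetical stand-in for the converter: pass the input through
    # unchanged, and warn via logging instead of print.
    logger.warning(
        "Shape: Differently implemented in relay as a bypass (dummy operator)")
    return inputs[0]
```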
@srkreddy1238 @jroesch @nishi-t please help review this PR again
@zhreshold Should we put the tests under https://github.com/dmlc/tvm/tree/master/tests/python/frontend, and add it to CI https://github.com/dmlc/tvm/blob/master/tests/scripts/task_python_frontend.sh#L30-L34?
@Huyuwei Seems like there is
@zhreshold tests/python/frontend is the right place to hold frontend tests; tests/python/relay/frontend is duplicated and should be removed. In fact, tests/python/relay/frontend is not running in the CI.
@Huyuwei I'll remove the duplicate tests in a separate PR.
ok, working on that
jroesch left a comment
LGTM 👍
ONNX -> Relay frontend migration.
This is co-authored by @siju-samuel with initial attempt #2245.
relay/frontend/common module.