[Relay][Frontend] Add MXNet test example for relay #2316
Conversation
I already did type correction in:

```python
import tvm
from tvm import relay

x = relay.var("x", shape=(1, 64, 56, 56))
bias = relay.var("bias")
bias2 = relay.var("bias")
weight = relay.var("weight")
y = relay.nn.conv2d(x, weight, channels=64, kernel_size=(3, 3), padding=(1, 1))
y = relay.nn.bias_add(y, bias)
y = relay.nn.bias_add(y, bias2)
y = relay.Function(relay.ir_pass.free_vars(y), y)
a = relay.ir_pass.infer_type(y)
a = relay.ir_pass.canonicalize_ops(a)
print(a.astext())
```
If you add another stmt in between these two bias_add calls, it will fail:

```python
import tvm
from tvm import relay

x = relay.var("x", shape=(1, 64, 56, 56))
bias = relay.var("bias")
bias2 = relay.var("bias")
weight = relay.var("weight")
weight2 = relay.var("weight")
y = relay.nn.conv2d(x, weight, channels=64, kernel_size=(3, 3), padding=(1, 1))
y = relay.nn.bias_add(y, bias)
y = relay.nn.conv2d(y, weight2, channels=128, kernel_size=(3, 3), padding=(1, 1))
y = relay.nn.bias_add(y, bias2)
y = relay.Function(relay.ir_pass.free_vars(y), y)
a = relay.ir_pass.infer_type(y)
a = relay.ir_pass.canonicalize_ops(a)
print(a.astext())
```
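[Editor's note] The failure pattern here can be illustrated without TVM. Below is a minimal, hedged sketch (all class and field names are illustrative, not TVM's API) of a post-order rewriter: rebuilt child nodes lose their inferred type annotation, so a `bias_add` whose argument was itself rewritten (the conv2d inserted in between) sees an untyped child. Reading the type from the *original*, still-annotated node avoids the crash.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    op: str
    args: list
    checked_type: Optional[str] = None  # filled in by "type inference"

def simplify_bias_add(node):
    """Post-order rewrite; rebuilt nodes start with no checked_type."""
    if not isinstance(node, Node):
        return node
    new_args = [simplify_bias_add(a) for a in node.args]
    rebuilt = Node(node.op, new_args)       # annotation is lost here
    if node.op == "bias_add":
        # Correct: take the type from the original, still-annotated node.
        # The buggy variant reads rebuilt.args[0].checked_type, which is
        # None whenever the argument subtree was itself rewritten.
        ttype = node.args[0].checked_type
        assert ttype is not None
        return Node("add", [rebuilt.args[0], f"bias expanded to {ttype}"])
    return rebuilt

# conv2d -> bias_add -> conv2d -> bias_add, as in the failing example above
x  = Node("var", [], "Tensor[(1, 64, 56, 56)]")
c1 = Node("conv2d", [x], "Tensor[(1, 64, 56, 56)]")
b1 = Node("bias_add", [c1, Node("var", [], "Tensor[(64,)]")],
          "Tensor[(1, 64, 56, 56)]")
c2 = Node("conv2d", [b1], "Tensor[(1, 128, 56, 56)]")
b2 = Node("bias_add", [c2, Node("var", [], "Tensor[(128,)]")],
          "Tensor[(1, 128, 56, 56)]")

out = simplify_bias_add(b2)
print(out.op)  # add
```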
Try this one line change!

```diff
diff --git a/src/relay/pass/canonicalize_ops.cc b/src/relay/pass/canonicalize_ops.cc
index 77cd59e2..4482dc39 100644
--- a/src/relay/pass/canonicalize_ops.cc
+++ b/src/relay/pass/canonicalize_ops.cc
@@ -22,7 +22,7 @@ class BiasAddSimplifier : public ExprMutator {
     CHECK_EQ(call->args.size(), 2);
     const BiasAddAttrs* param = call->attrs.as<BiasAddAttrs>();
-    auto ttype = call->args[0]->type_as<TensorTypeNode>();
+    auto ttype = n->args[0]->type_as<TensorTypeNode>();
     size_t n_dim = ttype->shape.size();
     Expr expanded_bias = ExpandBiasToMatchAxis(call->args[1], n_dim, {param->axis});
     Expr ret = Add(call->args[0], expanded_bias);
```
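[Editor's note] For intuition about what this pass produces, here is a hedged NumPy sketch (the function name is mine, not TVM's) of the rewrite `bias_add` is canonicalized into: a plain add against a bias expanded to broadcast along the channel axis. Computing that expanded shape requires the input's rank, which is why the pass needs the inferred type in the first place.

```python
import numpy as np

def bias_add_canonical(data, bias, axis=1):
    # Expand the 1-D bias so it broadcasts along `axis` of `data`,
    # mirroring what ExpandBiasToMatchAxis does in the C++ pass.
    n_dim = data.ndim            # needs the input type's rank
    shape = [1] * n_dim
    shape[axis] = bias.shape[0]
    return data + bias.reshape(shape)

x = np.random.randn(1, 64, 56, 56).astype("float32")
b = np.random.randn(64).astype("float32")
out = bias_add_canonical(x, b)
assert out.shape == (1, 64, 56, 56)
```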
Thanks @merrymercy. This change fixed the issue. Updated it in this PR.
* Add MXNet test example for relay
* Fix a bug in BiasAddSimplifier
This PR seems to duplicate https://github.com/dmlc/tvm/tree/master/tests/python/frontend/mxnet — should we consolidate everything into one location?