My model runs fine without hybridize(), but I need the speed boost. As soon as I call hybridize(), training fails with this traceback:
File "normal_def/train.py", line 225, in <module>
train()
File "normal_def/train.py", line 178, in train
users, items, item_counts, set_sizes, user_meals)
File "/usr/local/anaconda3/lib/python3.6/site-packages/mxnet/gluon/block.py", line 413, in __call__
return self.forward(*args)
File "/usr/local/anaconda3/lib/python3.6/site-packages/mxnet/gluon/block.py", line 621, in forward
return self._call_cached_op(x, *args)
File "/usr/local/anaconda3/lib/python3.6/site-packages/mxnet/gluon/block.py", line 528, in _call_cached_op
out = self._cached_op(*cargs)
File "/usr/local/anaconda3/lib/python3.6/site-packages/mxnet/_ctypes/ndarray.py", line 149, in __call__
ctypes.byref(out_stypes)))
File "/usr/local/anaconda3/lib/python3.6/site-packages/mxnet/base.py", line 149, in check_call
raise MXNetError(py_str(_LIB.MXGetLastError()))
mxnet.base.MXNetError: Error in operator normaldef0_fastpoissonlogprob0__mul0: [17:27:23] src/operator/nn/../tensor/../elemwise_op_common.h:123: Check failed: assign(&dattr, (*vec)[i]) Incompatible attr in node normaldef0_fastpoissonlogprob0__mul0 at 1-th input: expected [9963,128], got [9963,1]
There are many multiplies and nodes of that shape in my model, so the operator name alone doesn't narrow it down. How can I figure out why hybridize is not working? In particular, which line of my code does normaldef0_fastpoissonlogprob0__mul0 correspond to?
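For reference, here is a minimal sketch (made-up block name and shapes, not my actual model) of the kind of multiply I suspect is involved: plain `*` broadcasts fine on NDArrays in imperative mode, but once the block is hybridized it lowers to elemwise_mul, whose shape inference requires identical shapes.

```python
from mxnet import nd, gluon

class LogProbLike(gluon.HybridBlock):
    """Hypothetical stand-in for my FastPoissonLogProb block."""
    def hybrid_forward(self, F, rate, count):
        # rate: (N, 128), count: (N, 1)
        # Runs imperatively (NDArray * broadcasts), but fails after hybridize()
        # because symbolic * is elemwise_mul and does not broadcast:
        return F.log(rate) * count - rate
        # A hybridize-safe version would broadcast explicitly:
        # return F.broadcast_mul(F.log(rate), count) - rate

net = LogProbLike()
net.hybridize()
rate = nd.random.uniform(1, 2, shape=(9963, 128))
count = nd.ones((9963, 1))
out = net(rate, count)  # raises "Incompatible attr ... expected [9963,128], got [9963,1]"
```

Even if that guess at the failure mode is right, I still have many candidate multiplies, hence the question about mapping the operator name back to a specific line.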