
[Bug] Error converting operator CumSum when 'reverse=1': (ptr) is false: The struct_info is not populated, check if you have normalized the expr #18135

@coffezhou

Description

Expected behavior

TVM should run the model correctly.
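
For reference, the expected flow after a successful import would look roughly like the sketch below. This is only a sketch: it assumes tvm_model is the IRModule returned by from_onnx(..., keep_params_in_input=True) in the reproduction script further down, that inputs is the dict loaded from inputs.pkl there, and it uses the standard Relax compile-and-run APIs (detach_params, relax.build, VirtualMachine) without having been verified against the attached model.

import tvm
from tvm import relax
from tvm.relax.frontend import detach_params

# Sketch only: compile the imported IRModule and run it on CPU, then compare
# the output against the ONNXRuntime result shown below.
tvm_model, params = detach_params(tvm_model)   # split out the bound weights
ex = relax.build(tvm_model, target="llvm")     # default relax pipeline (includes legalization)
vm = relax.VirtualMachine(ex, tvm.cpu())
args = [tvm.nd.array(v) for v in inputs.values()] + params.get("main", [])
tvm_output = vm["main"](*args)
print("TVM:\n", tvm_output)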

Actual behavior

For the following model,

[model graph screenshot: the ONNX model contains a CumSum node with reverse=1]

it can be executed by onnxruntime and produces the following results:

ONNXRuntime:
 [array([[3.2746487 , 2.0243466 , 0.8304557 , 0.55226177],
       [0.48739833, 0.47312835, 0.3515296 , 0.19696969],
       [9.719148  , 7.0277977 , 4.5064907 , 0.13069437]], dtype=float32)]

However, the ONNX frontend of TVM fails to import it:

File "/home/carla/Documents/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 3925, in from_onnx
    return g.from_onnx(graph, opset)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 3556, in from_onnx
    self._construct_nodes(graph)
  File "/home/carla/Documents/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 3736, in _construct_nodes
    raise err
  File "/home/carla/Documents/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 3731, in _construct_nodes
    op = self._convert_operator(op_name, inputs, attr, self.opset)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 3831, in _convert_operator
    sym = op_function(self.bb, inputs, attrs, [self._nodes, self._params])
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 1343, in _impl_v14
    data = bb.emit_te(topi.flip, data, axis=axis if axis else 0)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/relax/block_builder.py", line 540, in emit_te
    return self.emit(self.call_te(func, *args, **kwargs), name_hint=name_hint)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/relax/block_builder.py", line 356, in call_te
    tir_func, call_args, output_sinfo, tir_vars = gen_call_tir_inputs(func, *args, **kwargs)
                                                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/relax/utils.py", line 351, in gen_call_tir_inputs
    te_args = _convert_te_arg(args)
              ^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/relax/utils.py", line 289, in _convert_te_arg
    new_arg = _convert_te_arg_helper(te_args)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/relax/utils.py", line 273, in _convert_te_arg_helper
    return tuple(_convert_te_arg_helper(x) for x in arg)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/relax/utils.py", line 273, in <genexpr>
    return tuple(_convert_te_arg_helper(x) for x in arg)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/relax/utils.py", line 223, in _convert_te_arg_helper
    if isinstance(arg.struct_info, TensorStructInfo):
                  ^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/ir/expr.py", line 59, in struct_info
    return _ffi_api.ExprStructInfo(self)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "tvm/ffi/cython/./function.pxi", line 228, in tvm.ffi.core.Function.__call__
tvm.error.InternalError: Check failed: (ptr) is false: The struct_info is not populated, check if you have normalized the expr
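
According to the traceback, the CumSum converter (_impl_v14) calls bb.emit_te(topi.flip, data, ...) when reverse=1, and one of the arguments reaching _convert_te_arg_helper has no struct_info populated. As a standalone illustration of that failure mode (a sketch, unrelated to the attached model): accessing struct_info on a freshly constructed Relax expression raises exactly this InternalError until the expression is normalized, e.g. via bb.emit:

from tvm import relax

bb = relax.BlockBuilder()
x = relax.Var("x", relax.TensorStructInfo((3, 4), "float32"))
with bb.function("main", [x]):
    with bb.dataflow():
        call = relax.op.add(x, x)
        # call.struct_info  # raises: "The struct_info is not populated, ..."
        lv = bb.emit(call)        # emit() normalizes the expr and binds it
        print(lv.struct_info)     # R.Tensor((3, 4), dtype="float32")
        gv = bb.emit_output(lv)
    bb.emit_func_output(gv)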

Environment

OS: Ubuntu 20.04
TVM: 0.22.dev0 (c6969d7)
onnxruntime: 1.21.0

Steps to reproduce

This bug can be reproduced with the following code and the model in the attachment. As the script shows, onnxruntime executes the model successfully, but TVM fails to import it.

import sys

import numpy as np
import onnx
import onnxruntime

import tvm
from tvm import relax
from tvm.relax.frontend.onnx import from_onnx

import pickle


def main():
    onnx_model = onnx.load("111.onnx")

    with open("inputs.pkl", "rb") as fp:
        inputs = pickle.load(fp)

    # The model runs fine under onnxruntime.
    try:
        ort_session = onnxruntime.InferenceSession(
            onnx_model.SerializeToString(), providers=["CPUExecutionProvider"]
        )
        ort_output = ort_session.run([], inputs)
    except Exception as e:
        print(e)
        sys.exit(1)

    print("ONNXRuntime:\n", ort_output)

    # Convert the onnx model into relax through the onnx importer.
    # This is where the InternalError above is raised.
    tvm_model = from_onnx(onnx_model, keep_params_in_input=True)


if __name__ == "__main__":
    main()

testcase.zip
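
For readers without the attachment, a hypothetical minimal model exercising the same CumSum(reverse=1) conversion path can be built with onnx.helper as sketched below; the tensor names, shapes, and the choice of passing axis as an initializer are made up, so it may or may not reproduce the exact failure seen with 111.onnx.

import onnx
from onnx import TensorProto, helper

from tvm.relax.frontend.onnx import from_onnx

# Hypothetical minimal graph: a single CumSum node with reverse=1.
node = helper.make_node("CumSum", inputs=["x", "axis"], outputs=["y"], reverse=1)
graph = helper.make_graph(
    [node],
    "cumsum_reverse",
    inputs=[helper.make_tensor_value_info("x", TensorProto.FLOAT, [3, 4])],
    outputs=[helper.make_tensor_value_info("y", TensorProto.FLOAT, [3, 4])],
    initializer=[helper.make_tensor("axis", TensorProto.INT64, [], [1])],
)
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 14)])
onnx.checker.check_model(model)

# Expected to route through the CumSum converter's _impl_v14 branch.
tvm_model = from_onnx(model, keep_params_in_input=True)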

Triage

Please refer to the list of label tags here to find the relevant tags and add them below in a bullet format (example below).

  • needs-triage
