Got an error when running bundle inference with TensorRT torchscript #6124

@binliunls

Description

Describe the bug
When I run bundle inference with a TensorRT-based TorchScript model, the error below appears.

  File "/usr/local/lib/python3.8/dist-packages/monai/handlers/stats_handler.py", line 179, in exception_raised
    raise e
  File "/usr/local/lib/python3.8/dist-packages/ignite/engine/engine.py", line 1068, in _run_once_on_dataset_as_gen
    self.state.output = self._process_function(self, self.state.batch)
  File "/usr/local/lib/python3.8/dist-packages/monai/engines/evaluator.py", line 300, in _iteration
    with engine.mode(engine.network):
  File "/usr/lib/python3.8/contextlib.py", line 113, in __enter__
    return next(self.gen)
  File "/usr/local/lib/python3.8/dist-packages/monai/networks/utils.py", line 389, in eval_mode
    training = [n for n in nets if n.training]
  File "/usr/local/lib/python3.8/dist-packages/monai/networks/utils.py", line 389, in <listcomp>
    training = [n for n in nets if n.training]
  File "/usr/local/lib/python3.8/dist-packages/torch/jit/_script.py", line 785, in __getattr__
    return super(RecursiveScriptModule, self).__getattr__(attr)
  File "/usr/local/lib/python3.8/dist-packages/torch/jit/_script.py", line 502, in __getattr__
    return super(ScriptModule, self).__getattr__(attr)
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/module.py", line 1587, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'RecursiveScriptModule' object has no attribute 'training'

This appears to be caused by the with engine.mode(engine.network): code here, in which the engine.network.training attribute of the TorchScript model is accessed directly without a check; a TensorRT-compiled RecursiveScriptModule does not expose that attribute. I think we should add a check in evaluator.py (or in monai/networks/utils.py) to avoid this issue.
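A minimal sketch of what such a check could look like, modeled on the eval_mode context manager from monai/networks/utils.py shown in the traceback. This is only an illustration of the defensive-access idea, not MONAI's actual implementation: getattr with a default treats modules without a .training attribute as already being in eval mode.

```python
from contextlib import contextmanager


@contextmanager
def eval_mode(*nets):
    # Collect only the modules that both expose `.training` and are
    # currently in training mode. TorchScript/TensorRT modules that
    # raise AttributeError for `.training` are simply skipped instead
    # of crashing the evaluator.
    training = [n for n in nets if getattr(n, "training", False)]
    try:
        for n in training:
            n.eval()
        yield nets
    finally:
        # Restore training mode only on the modules we switched.
        for n in training:
            n.train()
```

With this guard, a module like the failing RecursiveScriptModule passes through the context manager untouched, while ordinary nn.Module instances are toggled to eval mode and restored as before.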

To Reproduce
Steps to reproduce the behavior:

  1. Start a MONAI docker container.
  2. Convert a MONAI bundle from the model zoo to a TensorRT-based TorchScript model.
  3. Add import torch_tensorrt to the imports section of inference.json.
  4. Modify the network_def to torch.jit.load(<model_name>).
  5. Disable the CheckpointLoader handler.
  6. Run the command: python -m monai.bundle run evaluating --meta_file configs/metadata.json --config_file configs/inference.json --logging_file configs/logging.conf
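Steps 3 and 4 amount to editing inference.json so the bundle instantiates the compiled model directly. A minimal sketch of the relevant fragment, using MONAI's _target_ instantiation syntax; the file name models/model_trt.ts is a hypothetical placeholder for the converted model:

```json
{
    "imports": [
        "$import torch_tensorrt"
    ],
    "network_def": {
        "_target_": "torch.jit.load",
        "f": "models/model_trt.ts"
    }
}
```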

Expected behavior
Run the bundle and get the inference results.
