[Relay][Frontend][Pytorch] Prelu definition mismatch in pytorch  #8184

@YuhengHuang42

Description

A similar bug was found in the ONNX frontend and fixed by PR #7208.

However, the bug still exists in the PyTorch frontend.

By the PReLU definition in PyTorch, num_parameters can be either 1 or the number of input channels.

However, PReLU in TVM currently appears to support only num_parameters = number of channels, so an error is raised if num_parameters = 1 while the input has more than one channel.

Note that PyTorch sets num_parameters = 1 by default.
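The mismatch could presumably be resolved in the converter in the same spirit as the ONNX fix, by broadcasting a single learned slope to one value per channel before lowering to TVM's prelu. A minimal NumPy sketch (the helper name and the exact insertion point in the frontend are assumptions for illustration, not part of TVM):

```python
import numpy as np

def expand_prelu_alpha(alpha, num_channels):
    """Tile a single PReLU slope to one value per channel.

    PyTorch allows num_parameters == 1, while TVM's prelu expects one
    alpha per channel, so a scalar slope is repeated before conversion.
    (Hypothetical helper for illustration, not part of the TVM frontend.)
    """
    alpha = np.asarray(alpha, dtype="float32").reshape(-1)
    if alpha.size == 1:
        alpha = np.repeat(alpha, num_channels)
    return alpha

# A num_parameters=1 module applied to a 6-channel input would need
# an alpha of shape (6,):
expand_prelu_alpha([0.25], 6)  # → array of six 0.25 values
```

If alpha already has one entry per channel, the helper leaves it unchanged, so the same code path would serve both PyTorch configurations.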

(The current workaround is to export the PyTorch model to ONNX first and then import the ONNX model.)

Code to reproduce

import torch
import tvm 
from tvm import relay

minimal_example = torch.nn.Sequential(
    torch.nn.PReLU(num_parameters=1)
)
minimal_example.eval()

input_shape = (1, 6, 10, 10) 
random_input = torch.randn(input_shape)
trace = torch.jit.trace(minimal_example, random_input)
input_info = [("input0", input_shape)]
# Raises an error: alpha has 1 element but the input has 6 channels.
mod, params = tvm.relay.frontend.from_pytorch(trace, input_info)

Environment

TVM version: 0.8.dev0 at cc3d60e

Pytorch version: 1.8.1

OS version: macOS 10.15.7
