
Conversation

@comaniac (Contributor)

nn.dropout is usually simplified away in inference, but not in training, so we need to register this attribute to ensure that a model with nn.dropout can pass all Relay passes.

cc @anijain2305
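
For context, a minimal sketch of what such a TOpPattern registration looks like on the Python side (the exact file and form used in this patch may differ):

```python
# Sketch only: register the fusion pattern attribute for nn.dropout so that
# FuseOps treats it as opaque instead of failing on a missing TOpPattern.
# After this PR the registration already exists upstream, so re-running this
# on a stock build may conflict with the existing attribute.
from tvm.relay.op import op as _reg

# OPAQUE means the operator is never fused with its neighbours.
_reg.register_pattern("nn.dropout", _reg.OpPattern.OPAQUE)
```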

@masahi masahi merged commit 4976bb2 into apache:main Mar 18, 2021
@comaniac comaniac deleted the fuse_pattern_dropout branch March 18, 2021 05:34
trevor-m pushed a commit to trevor-m/tvm that referenced this pull request May 6, 2021
trevor-m pushed a commit to neo-ai/tvm that referenced this pull request May 11, 2021
@Lyken17 (Contributor) commented Nov 26, 2021

Was this related to https://discuss.tvm.apache.org/t/toppattern-has-not-been-registered-for-nn-dropout/11305?

@AndrewZhaoLuo mentioned that

Dropout isn’t really supported

But there actually is a registration here.

@AndrewZhaoLuo (Contributor)

Hey Lyken, this isn't really registering a compute; instead, it's registering an attribute saying this operator should not be fused with anything, if I understand it correctly. I don't know the full context, though.
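
To illustrate the distinction with a small sketch (a toy graph for illustration, not code from this PR): in inference flows, SimplifyInference rewrites nn.dropout away before fusion runs, so the attribute registered here mainly matters when that pass is skipped, e.g. for training graphs.

```python
import tvm
from tvm import relay

# Toy graph containing nn.dropout (assumed shape/dtype for illustration).
x = relay.var("x", shape=(1, 16), dtype="float32")
y = relay.nn.dropout(x, rate=0.5)
mod = tvm.IRModule.from_expr(relay.Function([x], y))

# SimplifyInference replaces nn.dropout with the identity on its input,
# which is why dropout normally never reaches FuseOps in inference.
mod = relay.transform.InferType()(mod)
mod = relay.transform.SimplifyInference()(mod)
print(mod)
```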
