benchmark case failed #96

@liqiangxl

Description

(1) Easy fix for setupDivMaxSoftmaxDropoutForward: change contiguity({true, false, false, true}) to contiguity({true, c10::nullopt, c10::nullopt, true}).
(2) BiasDropoutAddLayernormBwd1_fp32 fails in this assert:

      TORCH_INTERNAL_ASSERT(
          val.has_value(),
          "Tried to evaluate the extent, ",
          extent->toInlineString(),
          " for the ptype: ",
          p_type,
          " to set launch bounds but could not.");

It works fine if this assert is skipped with:

      if(!val.has_value()) {
        continue;
      }
