Shape inference time doubled after #649 #749

@zasdfgbnm

Description

Before #649:

-------------------------------------------------------------------------------------------
Benchmark                                                 Time             CPU   Iterations
-------------------------------------------------------------------------------------------
LayerNormBackward_ShapeInference                       87.8 us         87.6 us         7758
LayerNormForward_ShapeInference                        29.9 us         29.8 us        22938
LayerNormBackward_NoShapeInferenceCachedBaseline       36.5 us         36.5 us        18906
LayerNormForward_NoShapeInferenceCachedBaseline        21.4 us         21.3 us        33063

After #649:

-------------------------------------------------------------------------------------------
Benchmark                                                 Time             CPU   Iterations
-------------------------------------------------------------------------------------------
LayerNormBackward_ShapeInference                        162 us          162 us         4332
LayerNormForward_ShapeInference                        50.3 us         50.2 us        13811
LayerNormBackward_NoShapeInferenceCachedBaseline        111 us          111 us         6304
LayerNormForward_NoShapeInferenceCachedBaseline        42.4 us         42.3 us        16632
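
For reference, the per-benchmark regressions work out to roughly 1.7x–3x, with the cached-baseline backward case hit hardest. A quick sanity-check script (numbers copied from the tables above, times in microseconds):

```python
# Times (us) from the "Before #649" and "After #649" benchmark runs above.
before = {
    "LayerNormBackward_ShapeInference": 87.8,
    "LayerNormForward_ShapeInference": 29.9,
    "LayerNormBackward_NoShapeInferenceCachedBaseline": 36.5,
    "LayerNormForward_NoShapeInferenceCachedBaseline": 21.4,
}
after = {
    "LayerNormBackward_ShapeInference": 162.0,
    "LayerNormForward_ShapeInference": 50.3,
    "LayerNormBackward_NoShapeInferenceCachedBaseline": 111.0,
    "LayerNormForward_NoShapeInferenceCachedBaseline": 42.4,
}

# Print the slowdown factor for each benchmark.
for name in before:
    print(f"{name}: {after[name] / before[name]:.2f}x slower")
```

So even the runs that skip shape inference entirely (the cached baselines) regressed, which suggests the slowdown is not confined to the shape-inference path itself.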
