This repository was archived by the owner on Nov 17, 2023. It is now read-only.

softrelu activation clipping bug in MKLDNN #11804

@safrooze

Description

With MKLDNN (enabled in the mxnet-cuxxmkl builds), the output of the softrelu activation is clipped to a maximum value of 88.37625885009766. This regression was introduced in 1.2.0, when MKLDNN was enabled for activations other than relu in #10336.

Looking at the MKLDNN implementation of softrelu, I don't see any overflow-prevention logic, so most likely the exp() call overflows and produces this bug.
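For reference, the standard overflow-safe way to compute softplus is to never pass a large positive argument to exp(). A minimal NumPy sketch of the two formulations (illustrative only, not MXNet's or MKLDNN's actual kernel code):

```python
import numpy as np

def softplus_naive(x):
    # Direct formula log(1 + exp(x)): exp() overflows float32 once
    # x exceeds ~88.72, so the result blows up (or saturates,
    # depending on the exp implementation) for large inputs.
    return np.log1p(np.exp(np.float32(x)))

def softplus_stable(x):
    # Equivalent rewrite: max(x, 0) + log1p(exp(-|x|)).
    # The argument to exp() is always <= 0, so it can never overflow;
    # for large positive x the result correctly approaches x.
    x = np.float32(x)
    return np.maximum(x, np.float32(0)) + np.log1p(np.exp(-np.abs(x)))
```

With this formulation, softplus_stable(100.0) returns 100.0 as expected, while the naive form overflows for the same input.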

Environment info (Required)

mxnet-cu90mkl >= 1.2.0b20180403

Minimum reproducible example

from mxnet import nd

nd.Activation(nd.array([100]), act_type='softrelu')

output (softrelu(100) should be approximately 100):

[ 88.37625885]
<NDArray 1 @cpu(0)>

