This repository was archived by the owner on Nov 17, 2023. It is now read-only.
Description
With MKLDNN (enabled in mxnet-cuxxmkl builds), the output of the softrelu activation is clipped to a maximum value of 88.37625885009766. This regression was introduced in 1.2.0, when MKLDNN was enabled for activations other than relu in #10336.
Looking at the MKLDNN implementation of softrelu, I don't see any overflow-prevention logic, so most likely the exp() call overflows and causes this bug.
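To illustrate the suspected failure mode (this is a numpy sketch, not MKLDNN's actual code path): softrelu is log(1 + exp(x)), and in float32 exp(x) overflows to inf once x exceeds ~88.72 (the log of FLT_MAX), so a direct transcription breaks for large inputs unless the expression is rewritten in an overflow-safe form.

```python
import numpy as np

def softrelu_naive(x):
    # Direct transcription of softrelu: log(1 + exp(x)).
    # In float32, exp(x) overflows to inf for x > ~88.72 (log of FLT_MAX),
    # so the result becomes inf instead of the expected value (~x).
    x = np.float32(x)
    with np.errstate(over="ignore"):
        return np.log(np.float32(1.0) + np.exp(x))

def softrelu_stable(x):
    # Overflow-safe reformulation:
    #   log(1 + exp(x)) = max(x, 0) + log1p(exp(-|x|))
    # exp(-|x|) never exceeds 1, so nothing can overflow.
    x = np.float32(x)
    return np.maximum(x, np.float32(0.0)) + np.log1p(np.exp(-np.abs(x)))
```

For inputs well above the float32 exp() limit, the naive form blows up while the stable form returns the expected value; both agree for small inputs. A guard of this kind is what appears to be missing from the MKLDNN path.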
Environment info (Required)
mxnet-cu90mkl >= 1.2.0b20180403
Minimum reproducible example
output:
@jinhuang415