Conversation
@pannous I do think
I kinda prefer the name log here, sticking with the name of the elementwise call, but either one is okay...
not my choice -- the MKL function is vsLn/vdLn.
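For reference, a minimal sketch (not the actual Caffe code, and `elementwise_log` is a hypothetical name) of the elementwise call being discussed. With MKL available, the work would be forwarded to `vsLn` (float) / `vdLn` (double), which compute the natural log over a whole vector in one call; a plain fallback looks like this:

```cpp
#include <cmath>

// Sketch of a caffe_log-style elementwise natural-log wrapper.
// With MKL, these would forward to vsLn / vdLn instead of looping.
void elementwise_log(const int n, const float* a, float* y) {
  for (int i = 0; i < n; ++i) {
    y[i] = std::log(a[i]);  // std::log is the natural log, i.e. "ln"
  }
}

void elementwise_log(const int n, const double* a, double* y) {
  for (int i = 0; i < n; ++i) {
    y[i] = std::log(a[i]);
  }
}
```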
This looks good to me as a counterpart to `ExpLayer`. As a more general issue affecting both of these, however, I wonder if this is the best way to resolve the tension between granularity and functionality. As far as I can tell, these layers could just implement the simple elementwise log/exp, with the shift and scale achieved by composition with other layers.
@longjon yeah, I put the shift/scale fields in to match the existing `ExpLayer`.
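For reference, the identity behind the granularity question: the general form factors into an affine map followed by a plain natural log and a constant scale,

$$\log_{\gamma}(\alpha x + \beta) = \frac{\ln(\alpha x + \beta)}{\ln \gamma},$$

so the same result could in principle be obtained by composing an affine/`PowerLayer`-style layer with a bare log; keeping the fields in `log_param` simply avoids the extra layers.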
Looks good to me.
This is fine. It could make for a nice warm-up PR in the future. @jeffdonahue rebase and merge away.
Thanks for the review @shelhamer and @longjon!
This adds `LogLayer`, a `NeuronLayer` which by default takes the natural log of its inputs. It's designed analogously to `ExpLayer` and `PowerLayer`. (In general it computes `log_{\gamma}(\alpha x + \beta)` with `log_param { base: \gamma scale: \alpha shift: \beta }`.)