Add parameter layer for learning any bottom #2079
Conversation
Cool, this looks good to me -- feel free to merge as you see fit. (I did realize after our discussion that this couldn't just be used with an …
Force-pushed af7b50e to 14295e0
This has now been rebased, as I'm continuing to use it in some of my models. I'll plan to merge soon per @jeffdonahue's prior approval, unless there are any further comments (in particular, if …
still LGTM
Fine by me. This does come up from time to time, so let's merge.
Okay, thanks for the eyes, merging this simple layer which I've made quite a bit of use of myself.
Add parameter layer for learning any bottom
(This is a minimal step in the direction of #1474. From discussion with @jeffdonahue.)
This layer simply holds a parameter blob of user-defined shape, and shares it as its single top.
This is useful if you want to learn disconnected bottoms in a net. In theory, all parameters could be handled this way, which would be a full realization of #1474. It is, however, not clear that that's the right thing to do from a user interface perspective: params would lose their semantic distinction, and their sizes would end up being double-specified since current layers already compute them.
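As a minimal sketch of how such a layer could be declared in a net prototxt (the layer type and parameter-message names are assumed to match this PR's API, and the shape is purely hypothetical):

```protobuf
# Holds a learnable 1x64 blob and exposes it as the top "embedding";
# the blob has no bottoms, so it is learned directly from gradients
# flowing into the top.
layer {
  name: "embedding"
  type: "Parameter"
  top: "embedding"
  parameter_param {
    shape { dim: 1 dim: 64 }  # hypothetical user-defined shape
  }
}
```

Downstream layers can then consume "embedding" as an ordinary bottom, which is what makes an otherwise disconnected input learnable.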
Whether or not this ends up being on the solution path, I wanted to go ahead and throw up this PR since I'm already making use of it with other, future PRs.