Give the python layer parameter/weight blobs. #2944
Conversation
Force-pushed from 4d189ef to 90963df.
Generally for this purpose I am using #2079, which ought to eventually, in accord with #1474, provide a more natural way to deal with parameters, especially when using net spec. But this seems like a reasonable exposure of the current state of affairs. For merge:
Fixed the style and added some tests.
Force-pushed from 2657df9 to 60c0d58.
Looks good, thanks @philkr!
Ah, I just merged this, so this comment is too late, but one more thing for the future: please squash style changes (here they appear in the test commit, which is confusing).
Thank you very much, exactly what I have been looking for! It turns out this is exactly what I want: the solver does update the weights. I tested it by setting the initial weights to zero and running backward with a gradient of ones; the resulting weight is 0 - base_lr * 1.
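The check described in the comment above amounts to a single vanilla SGD step (no momentum, no weight decay); a minimal sketch of the arithmetic, with an illustrative `base_lr` value:

```python
# One plain SGD update: w_new = w_old - base_lr * grad.
# With weights initialized to zero and a parameter gradient of one,
# the updated weight is 0 - base_lr * 1, matching the observed diff.
base_lr = 0.01  # illustrative value, not from the original comment
w_old, grad = 0.0, 1.0

w_new = w_old - base_lr * grad
assert w_new == -base_lr
```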
This PR allows the python layer to have its own parameter/weight blobs.
Since any python layer inherits from `caffe.Layer`, it already has a `self.blobs` variable that gets saved and loaded properly; however, it wasn't possible to add a new blob to that `BlobVec`. This PR adds a function `add_blob` to the `BlobVec` which allows a python layer to extend the vector with a new blob. The arguments of the `add_blob` function are the dimensions of the blob to be added. Here is a simple layer that demonstrates the use.
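The example layer itself did not survive the scrape; below is a minimal sketch of what such a layer could look like. The layer name and the scalar-scale behavior are illustrative, not from the original PR; only `self.blobs.add_blob(...)` taking the blob dimensions is what this PR actually provides. The `try`/`except` fallback is just so the sketch can be read without a Caffe install.

```python
import numpy as np

try:
    import caffe
    LayerBase = caffe.Layer
except ImportError:   # allow reading/testing the sketch without Caffe
    LayerBase = object


class ScalarScaleLayer(LayerBase):
    """Hypothetical python layer: multiplies its input by one learned scalar."""

    def setup(self, bottom, top):
        # add_blob takes the dimensions of the new parameter blob;
        # here we add a single scalar weight and initialize it to 1.
        self.blobs.add_blob(1)
        self.blobs[0].data[...] = 1.0

    def reshape(self, bottom, top):
        top[0].reshape(*bottom[0].data.shape)

    def forward(self, bottom, top):
        # out = w * in
        top[0].data[...] = self.blobs[0].data[0] * bottom[0].data

    def backward(self, top, propagate_down, bottom):
        # dL/dw = sum over elements of (top diff * input)
        self.blobs[0].diff[0] = np.sum(top[0].diff * bottom[0].data)
        if propagate_down[0]:
            # dL/din = w * top diff
            bottom[0].diff[...] = self.blobs[0].data[0] * top[0].diff
```

Because the parameter lives in `self.blobs`, the solver updates it like any other layer's weights, and it is saved and loaded with the net.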