fix layerSetUp of scale_layer to not add bias blob when already present#4600
jeffdonahue merged 1 commit into BVLC:master
Conversation
Thanks for the fix @bwilbertz. Can't this happen for the scale parameter as well, though? It seems like you could just skip all parameter setup whenever `blobs_` is already non-empty?
The thing is a bit tricky here, because we derive `bias_param_id_` from the current `this->blobs_.size()`. Having understood the case a bit better, I would suggest changing the condition to `if (this->blobs_.size() + bottom.size() < 3)` in order to treat both cases correctly. If you agree, I would update the PR accordingly. Regarding the initialization of the scale blob: I think this is OK as it is.
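For context, a minimal sketch of the approach being described, assuming the condition quoted in the diff further down; the `else` branch illustrates one way the already-present bias blob could be reused and is not a verbatim quote of the merged code:

```cpp
// Sketch (inside ScaleLayer<Dtype>::LayerSetUp, after the internal BiasLayer
// has been set up). Assumes the condition proposed in this PR.
if (this->blobs_.size() + bottom.size() < 3) {
  // Bias blob is not stored in this->blobs_ yet: append the one the
  // internal bias layer just created.
  bias_param_id_ = this->blobs_.size();
  this->blobs_.resize(bias_param_id_ + 1);
  this->blobs_[bias_param_id_] = bias_layer_->blobs()[0];
} else {
  // Bias blob already present (e.g. loaded with the net): point the
  // internal bias layer at the existing blob instead of adding a duplicate.
  bias_param_id_ = this->blobs_.size() - 1;
  bias_layer_->blobs()[0] = this->blobs_[bias_param_id_];
}
```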
Force-pushed from 3e34ce0 to 4665bae
@jeffdonahue I have updated the PR according to my last comment. Have you had a chance to look at it? Thanks, Benedikt
src/caffe/layers/scale_layer.cpp
bias_param_id_ = this->blobs_.size();
this->blobs_.resize(bias_param_id_ + 1);
this->blobs_[bias_param_id_] = bias_layer_->blobs()[0];
if ( this->blobs_.size() + bottom.size() < 3 ) {
rm leading/trailing whitespace (use `if (x)` rather than `if ( x )`)
Ok, thanks for the explanation and fix @bwilbertz -- this makes sense now. It seems a bit fragile and difficult to understand, and would ideally have unit tests, but that's unfortunately just kind of the state of things across Caffe right now... Maybe you could add a comment in at least one of the cases, e.g., for the `this->blobs_.size() + bottom.size() < 3` check, explaining which situations it covers?
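A hypothetical example of the kind of case-enumerating comment being requested (illustrative wording only, not the text that was eventually merged):

```cpp
// blobs_.size() + bottom.size() < 3 holds exactly when the bias blob has not
// been stored in this->blobs_ yet:
//   bottom.size() == 2, blobs_.size() == 0 -> scale comes in as a bottom blob
//   bottom.size() == 1, blobs_.size() == 1 -> only the scale parameter exists
// Otherwise (e.g. blobs_.size() == 2) the bias was already loaded with the net.
if (this->blobs_.size() + bottom.size() < 3) {
  // append the bias blob created by the internal bias layer (see diff above)
}
```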
Force-pushed from 4665bae to 52a647e
@jeffdonahue PR is updated. Thanks, Benedikt
Force-pushed from 52a647e to 9bc83e3
Hi @jeffdonahue. Is there anything I can improve to get this PR merged? Thanks, Benedikt
Nope -- thanks for the fixes, and apologies for the delay!
When a net is loaded that has already been initialized and contains valid blob data, the LayerSetUp of the scale layer would add the blob for the bias term a second time (resulting in three blobs instead of only two).
This PR adds the blob only if it is not already present.
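For background on where the pre-existing blob comes from: Caffe's base `Layer` constructor restores any blobs serialized in the `LayerParameter` before `LayerSetUp` is ever called. A paraphrased sketch of that constructor logic (see `include/caffe/layer.hpp`; not code from this PR):

```cpp
// Paraphrased from the caffe::Layer<Dtype> constructor: if the loaded
// LayerParameter already carries blob data, blobs_ is populated here,
// before LayerSetUp runs -- which is why LayerSetUp must not append the
// bias blob a second time.
if (layer_param_.blobs_size() > 0) {
  blobs_.resize(layer_param_.blobs_size());
  for (int i = 0; i < layer_param_.blobs_size(); ++i) {
    blobs_[i].reset(new Blob<Dtype>());
    blobs_[i]->FromProto(layer_param_.blobs(i));  // restore saved values
  }
}
```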
@jeffdonahue I think the original code of the scale layer was written by you. Could you have a look and check that this fix does not miss any case you had in mind?
Thanks,
Benedikt