FlattenLayer fix(?) -- top should always Share with bottom #3025

Open
jeffdonahue wants to merge 1 commit into BVLC:master from jeffdonahue:flatten-fix

Conversation

@jeffdonahue
Contributor

I pulled this commit from #2033, where it is no longer necessary (the current version doesn't use `FlattenLayer`), but in some previous version of my RNN implementation, this was causing incorrect behavior. The current implementation does `bottom[0]->ShareDiff(*top[0])`, which means that if `bottom[0]` has its own diff `SyncedMemory`, it gets deleted, which in #2033 must have interacted with my sharing of input diffs in weird ways.
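
For reference, the pre-patch sharing looks roughly like this (a paraphrased sketch of `flatten_layer.cpp`, not the verbatim source; header path per the current Caffe layout):

```cpp
#include <vector>

#include "caffe/layers/flatten_layer.hpp"

namespace caffe {

// Pre-patch behavior (sketch): forward points top's data at bottom's,
// but backward points bottom's diff at top's -- so a bottom blob that
// arrived with its own diff SyncedMemory has that memory dropped here.
template <typename Dtype>
void FlattenLayer<Dtype>::Forward_cpu(const vector<Blob<Dtype>*>& bottom,
    const vector<Blob<Dtype>*>& top) {
  top[0]->ShareData(*bottom[0]);
}

template <typename Dtype>
void FlattenLayer<Dtype>::Backward_cpu(const vector<Blob<Dtype>*>& top,
    const vector<bool>& propagate_down, const vector<Blob<Dtype>*>& bottom) {
  bottom[0]->ShareDiff(*top[0]);  // the call questioned in this PR
}

}  // namespace caffe
```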

This changes the behavior to work the same way as `ReshapeLayer`. It also seems somewhat more natural, as the bottom blobs really "belong" to some other layer (whichever one outputs them as tops), whereas a layer's tops sort of belong to it, so it seems less dangerous for a layer to delete the underlying `SyncedMemory` of its own tops. However, I don't know of any actual cases where this causes problems in current Caffe, so I'll leave it to others to decide whether this should be merged.
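
One way the `ReshapeLayer`-style direction could look, keeping the sharing in the forward pass (a sketch of the idea only, not the actual one-commit diff attached to this PR):

```cpp
// Patched direction (sketch): top always shares with bottom, so the
// only SyncedMemory replaced is that of the layer's own top blob.
template <typename Dtype>
void FlattenLayer<Dtype>::Forward_cpu(const vector<Blob<Dtype>*>& bottom,
    const vector<Blob<Dtype>*>& top) {
  top[0]->ShareData(*bottom[0]);
  // After this call, top's diff aliases bottom's diff, so upstream
  // gradients land directly in bottom's diff memory.
  top[0]->ShareDiff(*bottom[0]);
}

template <typename Dtype>
void FlattenLayer<Dtype>::Backward_cpu(const vector<Blob<Dtype>*>& top,
    const vector<bool>& propagate_down, const vector<Blob<Dtype>*>& bottom) {
  // Nothing to do: top's diff already aliases bottom's diff.
}
```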

@longjon
Contributor

longjon commented Sep 4, 2015

I haven't thought carefully (yet?) about whether the current behavior is okay, but I agree that this patch changes the code to what I would expect.

@ronghanghu
Member

Aside from the "ownership" issue, in my opinion a layer should never change its bottom during reshape and forward (and perhaps never change its top during backward). Ideally we should have `const Blob<Dtype>*` instead of `Blob<Dtype>*` for bottom blobs in `Reshape` and `Forward`.

I advocate for putting the sharing in `Reshape` instead of `Forward` and `Backward`.
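
A sketch of that placement (hypothetical code; the `top_shape` computation from `flatten_param` is elided):

```cpp
template <typename Dtype>
void FlattenLayer<Dtype>::Reshape(const vector<Blob<Dtype>*>& bottom,
    const vector<Blob<Dtype>*>& top) {
  vector<int> top_shape;
  // ... fill top_shape from flatten_param as the existing Reshape does ...
  top[0]->Reshape(top_shape);
  // Alias both data and diff once per reshape; Forward and Backward then
  // need no Share* calls, so neither pass mutates its bottoms or tops.
  top[0]->ShareData(*bottom[0]);
  top[0]->ShareDiff(*bottom[0]);
}
```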

@longjon
Contributor

longjon commented Sep 4, 2015

@ronghanghu yes; for details on why we don't (currently?) have `const Blob*` bottoms in forward, see #945.
