FlattenLayer fix(?) -- top should always Share with bottom #3025
Open
jeffdonahue wants to merge 1 commit into BVLC:master from …
Conversation
Contributor
I haven't thought carefully (yet?) about whether the current behavior is okay, but I agree that this patch changes the code to what I would expect.
Member
Except for the "ownership" issue, in my opinion a layer should never change its bottom during reshape and forward (and perhaps never change its top during backward). I advocate for putting sharing in Reshape instead of Forward and Backward; ideally we should have everything in Reshape.
Contributor
@ronghanghu yes; for details on why we don't (currently?) have everything in Reshape, …
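For context, here is a rough sketch of the "everything in Reshape" placement discussed above, modeled on the pattern Caffe's `ReshapeLayer` uses (paraphrased from memory, not quoted from this patch; the flattened-shape computation is elided and the usual Caffe `Layer`/`Blob` declarations are assumed to be in scope):

```cpp
// Sketch only: do the sharing once, when shapes are set up, so Forward and
// Backward have nothing left to do for this layer. Illustrates the placement
// advocated in the comment above; not the verbatim change in this PR.
template <typename Dtype>
void FlattenLayer<Dtype>::Reshape(const vector<Blob<Dtype>*>& bottom,
    const vector<Blob<Dtype>*>& top) {
  // ... compute top_shape from bottom[0] and call top[0]->Reshape(top_shape) ...
  top[0]->ShareData(*bottom[0]);  // top aliases bottom's data memory
  top[0]->ShareDiff(*bottom[0]);  // top aliases bottom's diff memory
}
```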
I pulled this commit from #2033, where it is no longer necessary (the current version doesn't use FlattenLayer), but in some previous version of my RNN implementation, this was causing incorrect behavior. The current implementation does `bottom[0]->ShareDiff(*top[0])`, which means that if `bottom[0]` has its own diff `SyncedMemory`, it gets deleted, which in #2033 must have interacted with my sharing of input diffs in weird ways.

This changes the behavior to work the same way as `ReshapeLayer`. It also seems somewhat more natural, as the `bottom` blobs really "belong" to some other layer (whichever one outputs them as `top`s), whereas a layer's `top`s sort of belong to it, so it seems less dangerous for a layer to delete the underlying `SyncedMemory` of its own `top`s. However, I don't know of any actual cases where this causes problems in current Caffe, so I'll leave it to others to decide whether this should be merged.
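To make the ownership point concrete, here is a small self-contained toy (plain C++, not Caffe code; `std::shared_ptr<float>` stands in for the reference-counted `SyncedMemory`, and the `ShareData`/`ShareDiff` methods mimic how `Blob::ShareData`/`ShareDiff` replace one blob's memory handle with the other's). It shows why the direction of sharing matters: when bottom adopts top's diff, bottom's original buffer is replaced and anything still holding it silently diverges; when top adopts bottom's, the bottom blob's memory is never touched.

```cpp
// Toy illustration only -- not Caffe source.
#include <cstdio>
#include <memory>

struct ToyBlob {
  std::shared_ptr<float> data;
  std::shared_ptr<float> diff;
  explicit ToyBlob(float v)
      : data(std::make_shared<float>(v)), diff(std::make_shared<float>(0.f)) {}
  // Mimic Blob::ShareData / Blob::ShareDiff: adopt the other blob's memory.
  void ShareData(const ToyBlob& other) { data = other.data; }
  void ShareDiff(const ToyBlob& other) { diff = other.diff; }
};

int main() {
  // Old direction: bottom adopts top's diff. bottom's own diff buffer is
  // replaced (and would be freed once the last reference disappears); anything
  // set up to alias bottom's diff now points at stale memory.
  ToyBlob bottom(1.f), top(2.f);
  std::shared_ptr<float> alias_of_bottom_diff = bottom.diff;  // e.g. a shared input diff
  bottom.ShareDiff(top);
  *bottom.diff = 42.f;
  std::printf("old direction: bottom diff = %.0f, stale alias = %.0f\n",
              *bottom.diff, *alias_of_bottom_diff);  // 42 vs 0 -- diverged

  // Proposed direction: top adopts bottom's diff. bottom's memory is never
  // replaced, so existing aliases of the bottom blob stay valid.
  ToyBlob bottom2(1.f), top2(2.f);
  std::shared_ptr<float> alias2 = bottom2.diff;
  top2.ShareDiff(bottom2);
  *top2.diff = 42.f;
  std::printf("new direction: bottom diff = %.0f, alias = %.0f\n",
              *bottom2.diff, *alias2);  // 42 and 42 -- still the same buffer
  return 0;
}
```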