Fix dropout backward in TEST phase #502

Merged

sguada merged 5 commits into BVLC:dev from sguada:fix_dropout_backward on Jun 28, 2014
Conversation

@sguada (Contributor) commented on Jun 14, 2014

This PR just adds the option to pass gradients backward during the TEST phase.
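
For illustration, a minimal standalone sketch (an assumption about the behavior, not the PR's actual diff) of what passing the gradient through in the TEST phase amounts to: dropout applies no mask at test time, so the backward pass reduces to copying the top gradient into the bottom gradient.

```cpp
#include <algorithm>
#include <cstddef>

// Minimal sketch, not the PR's actual code: dropout applies no mask in the
// TEST phase, so backward just forwards the gradient unchanged from the
// top blob's diff into the bottom blob's diff.
void dropout_backward_test_phase(const float* top_diff, float* bottom_diff,
                                 std::size_t count) {
  std::copy(top_diff, top_diff + count, bottom_diff);
}
```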

@sguada (Contributor, Author) commented on Jun 14, 2014

@jeffdonahue I wasn't sure whether the copy of top_diff into bottom_diff could now be replaced with ShareDiff.
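
To make the sharing-versus-copying question concrete, here is a toy contrast; the type and method names below are illustrative assumptions, not Caffe's Blob API.

```cpp
#include <memory>
#include <vector>

// Toy blob, loosely mimicking the distinction discussed above; names and
// semantics are illustrative assumptions, not Caffe's Blob API.
struct ToyBlob {
  std::shared_ptr<std::vector<float>> diff =
      std::make_shared<std::vector<float>>();

  // Sharing: both blobs alias one buffer, so every later write is visible
  // to both and no per-iteration copy is needed.
  void ShareDiffFrom(const ToyBlob& other) { diff = other.diff; }

  // Copying: each blob keeps its own buffer; values are duplicated once.
  void CopyDiffFrom(const ToyBlob& other) { *diff = *other.diff; }
};
```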

sguada mentioned this pull request on Jun 14, 2014
Member (review comment on the diff)

Should this case really be in the tests? You can just skip this test with a filter on a machine where you don't want to run it.

@sguada (Contributor, Author) replied

I just copied the code from the TRAIN mode, so I don't know who added that in the first place. Maybe we can remove it.

@sguada (Contributor, Author) replied

@Yangqing I checked the log and it seems that you added the major >= 2 check in the first place (#502 (diff)).

Should we keep it or remove it?
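
For reference, a guard of the kind being discussed could be written against the CUDA runtime API as below; this is a sketch under the assumption that the check gates tests by compute capability, and the helper name is hypothetical.

```cpp
#include <cuda_runtime.h>

// Hypothetical helper sketching a compute-capability guard like the
// "major >= 2" check discussed above; Caffe's actual test code may differ.
bool device_major_at_least_2(int device = 0) {
  cudaDeviceProp prop;
  if (cudaGetDeviceProperties(&prop, device) != cudaSuccess) return false;
  return prop.major >= 2;
}
```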

@shelhamer (Member) commented

@sguada I agree we should be able to compute the gradient in the TEST phase, but I'm less comfortable with the copy and pointer checking. Please rebase for a clean merge too.

@sguada (Contributor, Author) commented on Jun 28, 2014

@shelhamer after talking with @jeffdonahue, we thought it was safer to just do a copy, and to add logic that skips the copy when source and destination are the same.

I wasn't sure whether it was safe to use ShareDiff in the backward pass or ShareData in the forward pass.
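
A sketch of the guard described above (hypothetical helper name; Caffe's real copy routine may differ): the copy is skipped when source and destination alias the same buffer, which is exactly the in-place case.

```cpp
#include <algorithm>
#include <cstddef>

// Hypothetical helper: avoid copying when source and destination are the
// same, as happens when the layer runs in-place.
void guarded_copy(const float* src, float* dst, std::size_t count) {
  if (src == dst) return;  // in-place: top and bottom share one buffer
  std::copy(src, src + count, dst);
}
```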

@sguada (Contributor, Author) commented on Jun 28, 2014

I think there could be a case where during TRAIN the data is not shared, but after running in TEST mode it becomes shared, so once it is back in TRAIN it stays shared. This case may never appear, though, since we usually use this layer in-place.

@shelhamer (Member) commented

Ok, fair enough! Rebase and merge as you like.


@sguada (Contributor, Author) commented on Jun 28, 2014

It is rebased and ready to merge, once @Yangqing confirms whether the cuda.major >= 2 check should stay or can be removed now.

@sguada (Contributor, Author) commented on Jun 28, 2014

The cuda.major >= 2 check has been removed from test_neuron_layer.

sguada added a commit that referenced this pull request on Jun 28, 2014: Fix dropout backward in TEST phase
sguada merged commit 6ce346f into BVLC:dev on Jun 28, 2014
mitmul pushed a commit to mitmul/caffe that referenced this pull request on Sep 30, 2014