
Fix the bug of incorrect gradient when loss_weight set to zero #3868

Closed

Cysu wants to merge 4 commits into BVLC:master from Cysu:fix-zero-loss-weight

Conversation


@Cysu Cysu commented Mar 22, 2016

This PR aims at fixing the bug raised in #2895.

A flag (diff_written_) is added to the Blob class to indicate whether its diff has been written. During backpropagation, the bottom diffs of layers that do not propagate down are cleared when necessary. The computation and memory overhead are negligible.
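A minimal sketch of the idea, for illustration only: the blob tracks whether anything has written its diff, and the backward pass zeroes the bottom diffs of layers that skip back-propagation (e.g. a loss layer with loss_weight == 0), so stale buffer contents cannot leak into the gradient. `mutable_cpu_diff` follows Caffe's naming, but `Blob`, `clear_diff`, and `BackwardStep` here are simplified, hypothetical stand-ins, not the PR's actual code.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Simplified blob holding only a gradient buffer and the new flag.
struct Blob {
  std::vector<float> diff;
  bool diff_written = false;

  explicit Blob(std::size_t count) : diff(count, 0.0f) {}

  // Writer path: any layer producing a gradient marks the diff valid.
  float* mutable_cpu_diff() {
    diff_written = true;
    return diff.data();
  }

  // Reset stale gradient contents and invalidate the flag.
  void clear_diff() {
    std::fill(diff.begin(), diff.end(), 0.0f);
    diff_written = false;
  }
};

// Hypothetical per-layer backward step over its bottom blobs.
void BackwardStep(std::vector<Blob*>& bottom,
                  const std::vector<bool>& propagate_down) {
  for (std::size_t i = 0; i < bottom.size(); ++i) {
    // If this layer does not back-propagate to bottom[i] and nothing
    // else wrote its diff, clear it instead of leaving old data that
    // lower layers would otherwise consume as a gradient.
    if (!propagate_down[i] && !bottom[i]->diff_written) {
      bottom[i]->clear_diff();
    }
  }
}
```

The flag check is what keeps the overhead negligible: the extra zeroing only runs for blobs whose diff was never written in the current pass.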

@shaibagon (Member) commented

thanks!

@seanbell seanbell added the bug label Apr 1, 2016
@dangweili commented
Hope this bug can be fixed soon.

@dangweili commented

Do any other loss functions hit the same bug when loss_weight is set to zero?

