Conversation
Force-pushed from 4ae229e to 4eae3c1.
Force-pushed from 1c64588 to fdeb483.
Comment on src/caffe/blob.cpp (outdated)
Contributor (Author):
I don't like writing or reading these duplicate methods for data and diff -- should these be in SyncedMemory, or perhaps a new abstraction layer between SyncedMemory and Blob?
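A standalone toy illustration of the duplication being objected to; the names (`ToyBlob`, `sumsq_data`/`sumsq_diff`) are hypothetical stand-ins, not Caffe's actual code. Each reduction is written twice with an identical body, differing only in which buffer it reads:

```cpp
#include <vector>

// Hypothetical sketch of the data/diff duplication pattern: every
// reduction over a blob must be written once for the data buffer and
// once, verbatim, for the diff buffer.
struct ToyBlob {
  std::vector<float> data_, diff_;

  float sumsq_data() const {   // duplicated body #1: reads data_
    float s = 0;
    for (float v : data_) s += v * v;
    return s;
  }
  float sumsq_diff() const {   // duplicated body #2: reads diff_
    float s = 0;
    for (float v : diff_) s += v * v;
    return s;
  }
};
```

Pushing the reduction down into SyncedMemory (or an intermediate layer) would let both methods delegate to a single implementation parameterized by which buffer to read.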
Force-pushed from fdeb483 to 401e43f.
Force-pushed from 57c2992 to cf79ed9.
Force-pushed from c7c283b to 6789a00.
Force-pushed from 6789a00 to f38ddef.
Contributor (Author):
Merging after discussion with @shelhamer.
This adds a `clip_gradients` field to `SolverParameter` that sets the maximum L2 norm of the parameter gradients. If the norm of the gradients exceeds this value, the gradients are scaled down so that their L2 norm equals it exactly. The norm is computed before the learning rate, etc. are applied, so clipping acts on the actual loss gradient.
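A minimal standalone sketch of the clipping rule described above; the function name and the flat gradient representation are illustrative, not Caffe's actual solver code:

```cpp
#include <cmath>
#include <vector>

// Hypothetical sketch of clip_gradients: compute the global L2 norm
// over all parameter gradients (diffs) and, if it exceeds the
// threshold, rescale every gradient by the same factor so the global
// norm becomes exactly clip_gradients.
void ClipGradients(std::vector<std::vector<float> >& diffs,
                   float clip_gradients) {
  if (clip_gradients <= 0) return;  // clipping disabled
  double sumsq = 0.0;
  for (const std::vector<float>& d : diffs)
    for (float g : d) sumsq += static_cast<double>(g) * g;
  const double l2norm = std::sqrt(sumsq);
  if (l2norm <= clip_gradients) return;  // already within the bound
  const float scale = static_cast<float>(clip_gradients / l2norm);
  for (std::vector<float>& d : diffs)
    for (float& g : d) g *= scale;  // global norm is now clip_gradients
}
```

With this in place, a user would set, e.g., `clip_gradients: 10` in the solver prototxt, and the solver would rescale the diffs before computing the update.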