Conversation
Contributor: We are interested in #1482 and #1484 for the next release. These PRs were opened in November and have already been partially reviewed. We are available to set aside working time to finalize them if you have further feedback. /cc @mtamburrano
Introduced by Layer type is a string #1694
Fix `draw` to support new protobuf format
To feed inputs of varying dimension, the `DATA` and `IMAGE_DATA` layers reshape their prefetch and top blobs when the batch size is 1. The `BasePrefetchingDataLayer` always reshapes on forward.
Reshape single input batches for inputs of varying dimension
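As a rough illustration of this behavior, here is a minimal pycaffe sketch, assuming a hypothetical definition `varsize_data.prototxt` whose `Data` layer reads variably-sized inputs with `batch_size: 1` and produces a top blob named `data`:

```python
import caffe

caffe.set_mode_cpu()

# With batch_size: 1 the data layer reshapes its top blob to match each input,
# so successive forward passes may produce differently shaped blobs.
net = caffe.Net('varsize_data.prototxt', caffe.TEST)  # hypothetical net definition
for _ in range(3):
    net.forward()
    print(net.blobs['data'].data.shape)  # can change from pass to pass
```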
Improve CMake build with automation, options, and more.
Groom handling of encoded image inputs
No need to independently test static and dynamic linking.
Dynamic Linking
Contributor: @shelhamer
Contributor: @ducha-aiki I would like to bet 10 bucks against us, too ;)
The pyreformation #1703 aligned pycaffe Net with the real Caffe Net. This broke the pre-processing helpers that had been bolted on to `caffe.Net`. These responsibilities are gathered in `caffe.io.Transformer` instead. This is only an intermediate step in a real solution to pycaffe IO that should make use of the same data pipeline as the rest of the framework. See #1245 for further thoughts on input processing.
This is a quick translation of the examples to the `caffe.io.Transformer` interface. The results are not strictly identical to the earlier implementation! The models now use a mean pixel instead of a mean image for simplicity. The output classifications and detections are preserved but scores may differ. Note: the examples will be rewritten to make use of `caffe.Net` alone since it is the true interface, but not yet.
Stop-gap fixes for pycaffe
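For example, a minimal sketch of the `caffe.io.Transformer` interface, assuming a deploy net with an input blob named `data`; the file names (`deploy.prototxt`, `weights.caffemodel`, `cat.jpg`) and the BGR mean pixel values are placeholders:

```python
import numpy as np
import caffe

net = caffe.Net('deploy.prototxt', 'weights.caffemodel', caffe.TEST)

# Configure pre-processing for the 'data' input.
transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
transformer.set_transpose('data', (2, 0, 1))                # HxWxC -> CxHxW
transformer.set_mean('data', np.array([104., 117., 123.]))  # example mean pixel (BGR)
transformer.set_raw_scale('data', 255)                      # [0, 1] -> [0, 255]
transformer.set_channel_swap('data', (2, 1, 0))             # RGB -> BGR

# Load an image, pre-process it, and run the net.
image = caffe.io.load_image('cat.jpg')
net.blobs['data'].data[...] = transformer.preprocess('data', image)
out = net.forward()
```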
[cmake] CMAKE_SOURCE/BINARY_DIR to PROJECT_SOURCE/BINARY_DIR
The warning is still useful to keep from accidentally running data without pre-processing.
relax MemoryData transform check to warning
This reverts the encoding cleanup since it breaks data processing for existing inputs as discussed in #1901.
add force_encoded_color flag to the data layer and warn about mixed encoding
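As a sketch only, the flag could be set through the Python protobuf bindings as below; the LMDB path is a placeholder and the field is assumed to be exposed on `DataParameter` under the name used in this PR, `force_encoded_color`. In a prototxt this corresponds to `force_encoded_color: true` inside `data_param`.

```python
from caffe.proto import caffe_pb2
from google.protobuf import text_format

layer = caffe_pb2.LayerParameter()
layer.name = 'data'
layer.type = 'Data'                      # layer type is a string (#1694)
layer.top.extend(['data', 'label'])
layer.data_param.source = 'examples/_temp/train_lmdb'   # placeholder path
layer.data_param.batch_size = 64
layer.data_param.backend = caffe_pb2.DataParameter.LMDB
# Decode encoded images as color to avoid mixing grayscale and color inputs.
layer.data_param.force_encoded_color = True

print(text_format.MessageToString(layer))
```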
* Sample code was added.
* The `slice_dim` and `slice_point` attributes were explained.
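For instance, a sketch of a `Slice` layer built with the Python protobuf bindings; the blob names and split points are illustrative only:

```python
from caffe.proto import caffe_pb2
from google.protobuf import text_format

layer = caffe_pb2.LayerParameter()
layer.name = 'slice_channels'
layer.type = 'Slice'
layer.bottom.append('data')
layer.top.extend(['part1', 'part2', 'part3'])
layer.slice_param.slice_dim = 1                # slice along the channel axis
layer.slice_param.slice_point.extend([4, 10])  # channels [0,4), [4,10), [10,end)

print(text_format.MessageToString(layer))
```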
shelhamer added a commit that referenced this pull request on Feb 20, 2015.
A lot has happened since the last release! This packages up 400+ commits by 27 authors. Thanks all! A follow-up release will come in ~3 weeks to cover documentation, cuDNN v2, and as always more new features.
With all releases one should do `make clean && make superclean` to clear out old materials before compiling the new release to avoid inconsistencies.

* Development is done in `master` and releases are tagged. This is simpler and frees up maintainer time for real work.
* Layer type is a string #1694: `REGISTER_LAYER_CLASS` follows `INSTANTIATE_CLASS`.
* Expose `Layer` and all `Solver`s to Python and implement `Layer`s in Python via the `PythonLayer` type (see the sketch after this list). This changes the pycaffe interface.
* Give phase to Net #1790: the phase is given to the `Net` and is immutable. This is the first step to dismantling the Caffe singleton and prevents Phase missteps like "data_transformers applies jitter in TEST phase" #1430. This changes the Net constructor and the pycaffe and matcaffe interfaces.
* Consolidates the param fields of the `LayerParameter` type; adds a `message ParamSpec` unifying parameter-related proto fields for a single parameter (`name`, `blobs_lr`, `weight_decay`, ...) into a single structure. You should update your model definitions.
* `Data` and `ImageData` layers now reshape when the batch size == 1 to handle inputs of varying dimension.
* Faster `CAFFE` engine by skipping `im2col` when it is the identity.
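A minimal sketch of a layer implemented via the new `PythonLayer` type; the class name and its pass-through behavior are illustrative, not taken from the bundled examples. In a model definition such a layer is declared with `type: 'Python'` and a `python_param` naming the module and class.

```python
import caffe


class PassThroughLayer(caffe.Layer):
    """Illustrative Python layer that copies its bottom blob to its top blob."""

    def setup(self, bottom, top):
        if len(bottom) != 1 or len(top) != 1:
            raise Exception('PassThroughLayer takes a single bottom and a single top.')

    def reshape(self, bottom, top):
        top[0].reshape(*bottom[0].data.shape)

    def forward(self, bottom, top):
        top[0].data[...] = bottom[0].data

    def backward(self, top, propagate_down, bottom):
        if propagate_down[0]:
            bottom[0].diff[...] = top[0].diff
```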
Fixes

* A Phase-related bug was fixed by Give phase to Net #1790; runs through `caffe test` were unaffected.
* `caffe.set_mode` and `caffe.set_device` used to be methods of `caffe.Net`. Fixed by exposing these methods at the module level in Give phase to Net #1790. Call `caffe.set_{mode,device}` before making a `Net` (see the sketch below). This also solves issues with pycaffe and cuDNN.
* `ImageDataLayer` would silently skip images it failed to load, but now it dies loudly.

Documentation: the follow-up release will concentrate on pycaffe + matcaffe examples and interface documentation.
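A short sketch of the module-level calls, using a placeholder model definition `def.prototxt`:

```python
import caffe

# Set the device and mode at the module level before constructing any Net.
caffe.set_device(0)
caffe.set_mode_gpu()
# or: caffe.set_mode_cpu()

# The phase is now an argument to the Net constructor.
net = caffe.Net('def.prototxt', caffe.TEST)
```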
Dependencies: unchanged, except
Interface Change / Deprecation
* Upgrade old model definitions with `upgrade_net_proto_{text,binary}`.
* Set the phase through `NetParameter::set_type` or the `Net(string file, Phase phase)` constructor; see Give phase to Net #1790.
* pycaffe: call `caffe.set_{mode,device}` before making a `Net` and include the phase as an argument, as in `net = caffe.Net('def.prototxt', caffe.TRAIN)` for example.
* pycaffe pre-processing is handled by `caffe.Transformer` -- refer to the bundled examples and `python/caffe/io.py` for details. See How to pycaffe #1774 for a suggestion on `caffe.Net` usage.

☕