Merged

60 commits
38cab75
fix caffe.proto style bugs
jeffdonahue Mar 14, 2014
53ca9cd
create file caffe.proto.v0 which duplicates current caffe.proto
jeffdonahue Mar 14, 2014
1dc3374
move individual layer parameters to individual proto messages
jeffdonahue Mar 14, 2014
9d29f72
move LayerParameter and individual layer param messages to bottom of
jeffdonahue Mar 14, 2014
0fe0e15
remove LayerConnection from proto, bottom and top now in LayerParameter
jeffdonahue Mar 14, 2014
f39fa72
NetParameter.layers -> layer
jeffdonahue Mar 15, 2014
e05d0ab
HDF5DataParameter message and concat_param
jeffdonahue Mar 15, 2014
195d46a
add duplicated params from InnerProductParam to ConvolutionParam and
jeffdonahue Mar 15, 2014
ecb3873
changes to layers etc to make 'make all' run successfully under new
jeffdonahue Mar 15, 2014
1816561
update tests for new proto format; now they compile
jeffdonahue Mar 15, 2014
dbd616b
move caffe.proto.v0 -> deprecated/caffe.v0.proto and add separate target
jeffdonahue Mar 15, 2014
e681c84
add v0->v1 'bridge' proto and add util that uses it
jeffdonahue Mar 15, 2014
c6d47bc
caffe.proto: layer->layers
jeffdonahue Mar 15, 2014
69ddcc8
add V0NetParameter and UpgradeV0Net
jeffdonahue Mar 15, 2014
be8d836
make ReadProtoFromTextFile not die on parse failure; add
jeffdonahue Mar 15, 2014
62d7f5c
add deprecated protos to PROTO_OBJS in makefile so things compile; other
jeffdonahue Mar 15, 2014
7e25c8c
make solver use upgrade_proto (by constructing net with a string) and
jeffdonahue Mar 15, 2014
17764c3
function to upgrade padding layers
jeffdonahue Mar 17, 2014
19cc990
set correct bottom blob name in upgraded conv layer
jeffdonahue Mar 17, 2014
72d3183
imagenet padding upgrade test
jeffdonahue Mar 17, 2014
1a68b3d
more padding layer upgrade tests
jeffdonahue Mar 17, 2014
3b09e50
add imagenet upgrade test and fix bug in upgrade_proto
jeffdonahue Mar 17, 2014
24e5e52
add test which includes upgraded params
jeffdonahue Mar 17, 2014
b95dc5f
fix insert_splits for new layer param format
jeffdonahue Mar 17, 2014
461b874
add upgrade_net_proto tool
jeffdonahue Mar 17, 2014
6de9f0a
add test for input/input_dim and fix bug, wasn't copying input
jeffdonahue Mar 17, 2014
6179062
put inputs before layers in the proto so they print in that order
jeffdonahue Mar 17, 2014
2e33058
LayerType enum
jeffdonahue Mar 17, 2014
1fbb5bb
convert existing models to new format (used tools/upgrade_net_proto with
jeffdonahue Mar 17, 2014
35ad409
fix post-rebase param bugs
jeffdonahue Mar 17, 2014
2e3e065
fix lint errors
jeffdonahue Mar 17, 2014
e0fbd48
fix layertype alphabetization
jeffdonahue Mar 17, 2014
a566570
rebase and fix stuff, incorporate image and padding layers
jeffdonahue Mar 18, 2014
9291f21
incorporate WindowDataLayer
jeffdonahue Mar 21, 2014
a0cba16
fix test_net for refactor
jeffdonahue Mar 21, 2014
dc1b25e
remove padding layer
jeffdonahue Mar 21, 2014
fb25211
some cleanup - lowercase layer class member variable names
jeffdonahue Mar 21, 2014
a68395c
alphabetize classes in vision_layers.hpp
jeffdonahue Mar 21, 2014
2aea6bb
some naming standardization: ImagesLayer -> ImageDataLayer (like other
jeffdonahue Mar 21, 2014
cdbb77c
make test_protobuf use NONE for dummy layer instead of SPLIT
jeffdonahue Mar 21, 2014
4667f0e
update deprecated protos to latest dev versions
jeffdonahue Mar 21, 2014
335697f
incorporate WindowDataLayer into V0Upgrade and add tests
jeffdonahue Mar 21, 2014
76e22ce
fix upgrade_net_proto name
jeffdonahue Mar 21, 2014
1b30e0a
upgrade_net_proto: allow input files already in new proto format
jeffdonahue Mar 21, 2014
fafa505
upgrade remaining prototxts
jeffdonahue Mar 21, 2014
c1fa11d
upgrade images layer
jeffdonahue Mar 23, 2014
17a59c3
make all tools backwards compatible with v0 net param
jeffdonahue Mar 23, 2014
85b9e82
regenerate imagenet_val feature extraction prototxt with missing
jeffdonahue Mar 24, 2014
927642e
allow upgrade_net_proto to also read/write binary protos (e.g. saved
jeffdonahue Mar 24, 2014
23bfeeb
add NetParameter required version number as breaking change for
jeffdonahue Mar 24, 2014
8198585
rollback previous commit adding version number to NetParameter -- going
jeffdonahue Mar 24, 2014
b7444d6
some post rebase fixes -- copyright, hdf5_output layer (still need to
jeffdonahue Mar 27, 2014
57167a0
cleaner version of refactoring with fields added to LayerConnection
jeffdonahue Mar 27, 2014
1d27409
add support for hdf5 output layer
jeffdonahue Mar 27, 2014
12d88d5
minor cleanup
jeffdonahue Mar 27, 2014
56ce562
fix upgrade_net_proto names
jeffdonahue Mar 27, 2014
e20c43a
update docs (and a couple comments) for refactored layerparam
jeffdonahue Mar 27, 2014
48e2e8e
add NetParameterPrettyPrint so that upgrade tool prints inputs before
jeffdonahue Mar 28, 2014
9f92159
move ReadNetParamsFrom{Text,Binary}File into util
jeffdonahue Mar 28, 2014
d821caf
rename test_innerproduct_layer to test_inner_product_layer
jeffdonahue Mar 28, 2014
4 changes: 2 additions & 2 deletions docs/feature_extraction.md
@@ -22,7 +22,7 @@ We're going to use the images that ship with caffe.
 
     find `pwd`/examples/images -type f -exec echo {} \; > examples/_temp/temp.txt
 
-The `ImagesLayer` we'll use expects labels after each filenames, so let's add a 0 to the end of each line
+The `ImageDataLayer` we'll use expects labels after each filenames, so let's add a 0 to the end of each line
 
     sed "s/$/ 0/" examples/_temp/temp.txt > examples/_temp/file_list.txt
 
@@ -37,7 +37,7 @@ Download the mean image of the ILSVRC dataset.
 We will use `data/ilsvrc212/imagenet_mean.binaryproto` in the network definition prototxt.
 
 Let's copy and modify the network definition.
-We'll be using the `ImagesLayer`, which will load and resize images for us.
+We'll be using the `ImageDataLayer`, which will load and resize images for us.
 
     cp examples/feature_extraction/imagenet_val.prototxt examples/_temp
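For orientation, here is a rough sketch of how an image data layer might be declared in the upgraded prototxt format. The `IMAGE_DATA` type name and the `image_data_param` fields follow the per-layer `*_param` pattern introduced in this pull, but they are assumptions, not lines copied from this diff:

    # Hypothetical sketch of a new-format image data layer (field names assumed).
    layers {
      name: "data"
      type: IMAGE_DATA
      image_data_param {
        source: "examples/_temp/file_list.txt"    # the file list built above
        mean_file: "data/ilsvrc12/imagenet_mean.binaryproto"
        batch_size: 50      # assumed value
        new_height: 256     # assumed resize fields
        new_width: 256
        crop_size: 227
      }
      top: "data"
      top: "label"
    }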
2 changes: 1 addition & 1 deletion docs/imagenet_training.md
@@ -55,7 +55,7 @@ Network Definition
 The network definition follows strictly the one in Krizhevsky et al. You can find the detailed definition at `examples/imagenet/imagenet_train.prototxt`. Note the paths in the data layer - if you have not followed the exact paths in this guide you will need to change the following lines:
 
     source: "ilvsrc12_train_leveldb"
-    meanfile: "../../data/ilsvrc12/imagenet_mean.binaryproto"
+    mean_file: "../../data/ilsvrc12/imagenet_mean.binaryproto"
 
 to point to your own leveldb and image mean. Likewise, do the same for `examples/imagenet/imagenet_val.prototxt`.
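To see where these two fields live in the upgraded format, a sketch of a new-style DATA layer follows; the batch_size, crop_size, and mirror values are illustrative assumptions rather than values taken from this diff:

    # Sketch only: source and mean_file now sit inside data_param.
    layers {
      name: "data"
      type: DATA
      data_param {
        source: "ilvsrc12_train_leveldb"
        mean_file: "../../data/ilsvrc12/imagenet_mean.binaryproto"
        batch_size: 256    # assumed value
        crop_size: 227     # assumed value
        mirror: true       # assumed value
      }
      top: "data"
      top: "label"
    }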
60 changes: 28 additions & 32 deletions docs/mnist_prototxt.md
@@ -17,11 +17,11 @@ Writing the Data Layer
 Currently, we will read the MNIST data from the leveldb we created earlier in the demo. This is defined by a data layer:
 
     layers {
-      layer {
-        name: "mnist"
-        type: "data"
+      name: "mnist"
+      type: DATA
+      data_param {
         source: "mnist-train-leveldb"
-        batchsize: 64
+        batch_size: 64
         scale: 0.00390625
       }
       top: "data"
@@ -35,9 +35,11 @@ Writing the Convolution Layer
 Let's define the first convolution layer:
 
     layers {
-      layer {
-        name: "conv1"
-        type: "conv"
+      name: "conv1"
+      type: CONVOLUTION
+      blobs_lr: 1.
+      blobs_lr: 2.
+      convolution_param {
         num_output: 20
         kernelsize: 5
         stride: 1
@@ -47,8 +49,6 @@ Let's define the first convolution layer:
         bias_filler {
           type: "constant"
         }
-        blobs_lr: 1.
-        blobs_lr: 2.
       }
       bottom: "data"
       top: "conv1"
@@ -65,10 +65,10 @@ Writing the Pooling Layer
 Phew. Pooling layers are actually much easier to define:
 
     layers {
-      layer {
-        name: "pool1"
-        type: "pool"
-        kernelsize: 2
+      name: "pool1"
+      type: POOLING
+      pooling_param {
+        kernel_size: 2
         stride: 2
         pool: MAX
       }
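The resulting pooling layer, with the bottom/top connections that lie outside the hunk filled in from the usual LeNet example (assumed), looks like this:

    layers {
      name: "pool1"
      type: POOLING
      pooling_param {
        kernel_size: 2
        stride: 2
        pool: MAX
      }
      bottom: "conv1"   # assumed, outside the hunk
      top: "pool1"      # assumed, outside the hunk
    }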
@@ -82,21 +82,21 @@ Similarly, you can write up the second convolution and pooling layers. Check `da
 
 Writing the Fully Connected Layer
 ----------------------------------
-Writing a fully connected layers is also simple:
+Writing a fully connected layer is also simple:
 
     layers {
-      layer {
-        name: "ip1"
-        type: "innerproduct"
+      name: "ip1"
+      type: INNER_PRODUCT
+      blobs_lr: 1.
+      blobs_lr: 2.
+      inner_product_param {
         num_output: 500
         weight_filler {
           type: "xavier"
         }
         bias_filler {
           type: "constant"
         }
-        blobs_lr: 1.
-        blobs_lr: 2.
       }
       bottom: "pool2"
       top: "ip1"
@@ -109,10 +109,8 @@ Writing the ReLU Layer
 A ReLU Layer is also simple:
 
     layers {
-      layer {
-        name: "relu1"
-        type: "relu"
-      }
+      name: "relu1"
+      type: RELU
       bottom: "ip1"
       top: "ip1"
     }
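The whole upgraded ReLU layer fits inside the hunk; note that bottom and top are both "ip1", which is the in-place trick the surrounding text mentions:

    layers {
      name: "relu1"
      type: RELU
      bottom: "ip1"
      top: "ip1"
    }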
@@ -122,18 +120,18 @@ Since ReLU is an element-wise operation, we can do *in-place* operations to save
 After the ReLU layer, we will write another innerproduct layer:
 
     layers {
-      layer {
-        name: "ip2"
-        type: "innerproduct"
+      name: "ip2"
+      type: INNER_PRODUCT
+      blobs_lr: 1.
+      blobs_lr: 2.
+      inner_product_param {
         num_output: 10
         weight_filler {
           type: "xavier"
         }
         bias_filler {
           type: "constant"
         }
-        blobs_lr: 1.
-        blobs_lr: 2.
       }
       bottom: "ip1"
       top: "ip2"
@@ -144,10 +142,8 @@ Writing the Loss Layer
 Finally, we will write the loss!
 
     layers {
-      layer {
-        name: "loss"
-        type: "softmax_loss"
-      }
+      name: "loss"
+      type: SOFTMAX_LOSS
       bottom: "ip2"
       bottom: "label"
     }
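The loss layer is fully contained in this hunk; in the new format it takes both ip2 and the label as bottoms and, as shown, declares no top blob:

    layers {
      name: "loss"
      type: SOFTMAX_LOSS
      bottom: "ip2"
      bottom: "label"
    }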