HDF5DataLayer: read matrix of features and labels from HDF5 file as input #147
Conversation
Need to update the Caffe installation instructions to mention which hdf5 package needs to be installed, e.g.:
@sergeyk Actually I cannot compile this version; could you tell me which libraries I need to install?
He almost certainly linked against Anaconda.

Nope, I'm running all-homebrew Python. It's simply `brew install hdf5`.

I tried `apt-get install libhdf5-serial-dev` and it still doesn't work, at least on Ubuntu 12.04.
Not sure what to tell you.

Actually, I get these linker errors too; wonder what's up.

That's the error I get.
Check the order of the includes and library paths in the Makefile. You could be building against the Anaconda version but linking against the /usr version, or vice versa. I ran into something like this before.
The problem is solved by commit 91eb77e ("Reverse the order of hdf5_hl hdf5 as LIBRARIES in Makefile") in PR #159. A library that is depended on must be listed after the libraries that depend on it.
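The link-order rule above reflects how GNU ld resolves symbols: a static library on the command line only satisfies references from objects and libraries listed before it. Since `hdf5_hl` depends on core `hdf5`, it must come first. A sketch of the relevant Makefile variable (the surrounding entries are assumed for illustration, not copied from Caffe's Makefile):

```makefile
# Broken: hdf5 is scanned before hdf5_hl introduces its references,
# so hdf5_hl's calls into core hdf5 symbols stay unresolved.
LIBRARIES := hdf5 hdf5_hl

# Fixed (the change in commit 91eb77e): the dependent library
# hdf5_hl precedes its dependency hdf5.
LIBRARIES := hdf5_hl hdf5
```

These entries expand to `-lhdf5_hl -lhdf5` on the linker line, which is the order that resolves cleanly.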
@kloudkl good catch. Now it compiles on Ubuntu too. @sergeyk @shelhamer Cheers!

This is useful because it lets us use Caffe for general classification tasks; for that use case, we usually don't need the image-specific machinery that DataLayer is full of.
This adds a dependency on HDF5.
Future commits will add the ability to read HDF5 files partially, and to iterate over multiple files in a directory.