This is a weird problem.
When imagenet_deploy.txt contains a weight_filler and/or bias_filler in a layer, the net.init() process doesn't continue.
When weight_filler and bias_filler are left out of the layers, net.init() works fine, but then restoring the trained model gives a dimension mismatch in the fc6 layer.
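For context, the kind of layer definition I mean looks roughly like this (a sketch only; the filler types and values here are just illustrative, taken from the standard ImageNet example, and the exact syntax depends on the Caffe prototxt version):

    layers {
      name: "fc6"
      type: INNER_PRODUCT
      bottom: "pool5"
      top: "fc6"
      inner_product_param {
        num_output: 4096
        # with these filler blocks present, net.init() hangs;
        # with them removed, init finishes but fc6 mismatches the trained model
        weight_filler {
          type: "gaussian"
          std: 0.005
        }
        bias_filler {
          type: "constant"
          value: 1
        }
      }
    }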