# Hyperparameter optimization for neural networks

This repository contains code for experimenting with dlib's recently released global optimizer (`find_min_global`) to tune neural network hyperparameters.

## Prerequisites

- **dlib**: Install dlib by cloning the repository (https://github.com/davisking/dlib) and following the instructions there.
- **TF-Slim**: Clone the TF models repository (https://github.com/tensorflow/models) and add `slim` to your `PYTHONPATH`:

  ```shell
  export PYTHONPATH=$PYTHONPATH:/path_to_your_folder/models/research/slim
  ```

- **Python packages**: Install all remaining requirements via pip:

  ```shell
  pip install -r requirements.txt
  ```

Download the binary version of CIFAR-100 from https://www.cs.toronto.edu/~kriz/cifar-100-binary.tar.gz, or run

```shell
wget http://www.cs.toronto.edu/~kriz/cifar-100-binary.tar.gz
```
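The binary CIFAR-100 files store one record per image: a coarse-label byte, a fine-label byte, and 3 × 32 × 32 pixel bytes in channel-major (R, G, B plane) order. As a sanity check of that layout, here is a small parsing sketch; it runs on a synthetic in-memory record rather than the downloaded `train.bin`, and the function name is illustrative, not part of this repository.

```python
import numpy as np

RECORD_BYTES = 1 + 1 + 3 * 32 * 32  # coarse label, fine label, 3x32x32 image

def parse_records(buf):
    """Parse raw CIFAR-100 binary records into labels and images."""
    data = np.frombuffer(buf, dtype=np.uint8).reshape(-1, RECORD_BYTES)
    coarse = data[:, 0]
    fine = data[:, 1]
    # Pixels are stored channel-major; transpose to HWC for display/training.
    images = data[:, 2:].reshape(-1, 3, 32, 32).transpose(0, 2, 3, 1)
    return coarse, fine, images

# Demo on one synthetic record (coarse label 7, fine label 42, black image):
record = bytes([7, 42]) + bytes(3 * 32 * 32)
coarse, fine, images = parse_records(record)
print(coarse[0], fine[0], images.shape)  # 7 42 (1, 32, 32, 3)
```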

## Usage

To run optimization over the three hyperparameters `depth_multiplier`, `weight_decay`, and `dropout_keep_prob` with default settings, run

```shell
python optimize.py --data_dir <DATA_DIR> --out_dir <OUT_DIR>
```