From 39054b349e83ead13127cd1bd6b90e3141bc0451 Mon Sep 17 00:00:00 2001
From: Amol Lele <19983848+leleamol@users.noreply.github.com>
Date: Thu, 4 Oct 2018 10:53:53 -0700
Subject: [PATCH 1/4] Updating the readme file for cpp-package and adding readme file for example directory.

---
 cpp-package/README.md         |  45 ++++++++++----
 cpp-package/example/README.md | 112 ++++++++++++++++++++++++++++++++++
 2 files changed, 145 insertions(+), 12 deletions(-)
 create mode 100644 cpp-package/example/README.md

diff --git a/cpp-package/README.md b/cpp-package/README.md
index 2b6e0e39f0fd..f7e48ddd7775 100644
--- a/cpp-package/README.md
+++ b/cpp-package/README.md
@@ -1,21 +1,42 @@
 # MXNet C++ Package
-To build the C++ package, please refer to [this guide]().
+The MXNet C++ Package provides C++ API bindings to the users of MXNet. Currently, these bindings are not available as a standalone package.
+The users of these bindings are required to build this package as mentioned below.
-A basic tutorial can be found at .
+## Building C++ Package
-The example directory contains examples for you to get started.
+The cpp-package directory contains the implementation of C++ API. As mentioned above, users are required to build this directory or package before using it.
+**The cpp-package is built while building the MXNet shared library, *libmxnet.so*.**
+
+### Steps to build the C++ package:
+1. Building the MXNet C++ package requires building MXNet from source.
+2. Clone the MXNet github repository **recursively** to ensure the code in submodules is available for building MXNet.
+3. Install the [prerequisites](), desired [BLAS libraries]() and optional [OpenCV, CUDA, and cuDNN]() for building MXNet from source.
+4. There is a configuration file for make, [make/config.mk]() that contains all the compilation options. You can edit this file and set the appropriate options prior to running the **make** command.
+5.
Please refer to [platfrom specific build instructions]() and available [build configurations](https://mxnet.incubator.apache.org/install/build_from_source#build-configurations) for more details.
+5. For enabling the build of C++ Package, set the **USE__CPP__PACKAGE = 1** in [make/config.mk](). Optionally, the compilation flag can also be specified on **make** command line as follows:
+   ```
+   make -j USE_CPP_PACKAGE=1
+   ```
+
+## Usage
+
+In order to consume the C++ API please follow the steps below
-## Building C++ examples in examples folder
+1. Ensure that the MXNet shared library is built from source with the **USE__CPP__PACKAGE = 1**.
+2. Include the [MxNetCpp.h]() in the program that is going to consume MXNet C++ API.
+   ```
+   #include
+   ```
+3. While building the program, ensure that the correct paths to the directories containing header files and MxNet shared library.
+4. The program links MxNet shared library dynamically. Hence the library needs to be accessible to the program during the runtime. This can be achieved by including the path to shared library to environment variable such as LD_LIBRARY_PATH.
-From cpp-package/examples directory
-- Build all examples in release mode: **make all**
-- Build all examples in debug mode : **make debug**
-By default, the examples are build to be run on GPU.
-To build examples to run on CPU:
-- Release: **make all MXNET_USE_CPU=1**
-- Debug: **make debug MXNET_USE_CPU=1**
+## Tutorial
+A basic tutorial can be found at .
+
+## Examples
+
+The example directory contains examples for you to get started.
-The makefile will also download the necessary data files and store in data folder. (The download will take couple of minutes, but will be done only once on a fresh installation.)
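For illustration only, the usage steps added by this patch amount to a consumer program along the following lines. This sketch is not part of the patch: the include path, directory layout, and compile flags are assumptions that depend on where MXNet was cloned and built with `USE_CPP_PACKAGE=1`.

```cpp
// Hypothetical minimal consumer of the MXNet C++ API (sketch only).
// Assumes the repository was cloned to ./mxnet and libmxnet.so was built
// with USE_CPP_PACKAGE=1, so that mxnet-cpp/MxNetCpp.h is available.
#include "mxnet-cpp/MxNetCpp.h"

#include <iostream>
#include <vector>

int main() {
  using namespace mxnet::cpp;
  Context ctx = Context::cpu();                  // or Context::gpu(0) on a GPU build
  std::vector<float> data = {1, 2, 3, 4, 5, 6};
  NDArray a(data, Shape(2, 3), ctx);             // copy host data into a 2x3 NDArray
  NDArray::WaitAll();                            // wait for the asynchronous copy
  std::cout << a << std::endl;
  MXNotifyShutdown();                            // release engine resources on exit
  return 0;
}
```

A build line consistent with steps 3 and 4 might look like `g++ -std=c++11 app.cpp -Imxnet/include -Imxnet/cpp-package/include -Lmxnet/lib -lmxnet` (paths assumed), with the directory containing `libmxnet.so` added to `LD_LIBRARY_PATH` at runtime.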
diff --git a/cpp-package/example/README.md b/cpp-package/example/README.md
new file mode 100644
index 000000000000..7effdeb40eb7
--- /dev/null
+++ b/cpp-package/example/README.md
@@ -0,0 +1,112 @@
+# MXNet C++ Package Examples
+
+## Building C++ examples
+
+The examples are built while building the MXNet library and cpp-package from source. However, they can be built manually as follows:
+
+From cpp-package/examples directory
+
+- Build all examples in release mode: **make all**
+- Build all examples in debug mode: **make debug**
+
+By default, the examples are build to be run on GPU. To build examples to run on CPU:
+
+- Release: **make all MXNET\_USE\_CPU=1**
+- Debug: **make debug MXNET\_USE\_CPU=1**
+
+The examples that are build to be run on GPU may not work on the non-GPU machines.
+The makefile will also download the necessary data files and store in data folder. (The download will take couple of minutes, but will be done only once on a fresh installation.)
+
+
+## Examples
+
+This directory contains following examples. In order to run the examples, ensure that the path to the MXNet shared library is added to the OS specific environment variable such as _LD\_LIBRARY\_PATH_ .
+
+### [alexnet.cpp]()
+
+The example implements C++ version of AlexNet. The networks trains the MNIST data. The number of epochs can be specified as command line arguement. For example:
+ ```
+ ./alexnet 10
+ ```
+
+### [charRNN.cpp]()
+
+The code implements C++ version charRNN for mxnet\example\rnn\char-rnn.ipynb with MXNet.cpp API. The generated params file is compatiable with python version. The train() and predict() has been verified with original data samples.
+
+The example expects arguments as follows:
+
+```
+ ./charRNN train [BuildIn\ [TImeMajor] {corpus file} { batch size} { max epoch} [{starting epoch}]
+ ./charRNN predict [BuildIn\ [TImeMajor] {param file} { batch size} { max epoch} [{starting epoch}]
+```
+
+### [googlenet.cpp]()
+
+The code implements GoogLeNet/Inception network using C++ API. The example uses MNIST data to train the network. The number of epochs can be specified in the command line as follows. If not specified, the model trains for 100 epochs.
+
+```
+./googlenet 10
+```
+
+### [mlp.cpp]()
+
+The code implements multilayer perceptron from scratch. The example creates its own dummy data to train the model. The example does not require command line parameters. It trains the model for 20000 iterations.
+
+```
+./mlp
+```
+
+### [mlp_cpu.cpp]()
+
+The code implements multilayer perceptron to train the MNIST data. The code demonstrates the use of "SimpleBind" C++ API and MNISTIter. The example is designed to work on CPU. The example does not require command line parameters.
+
+```
+./mlp_cpu
+```
+
+### [mlp_gpu.cpp]()
+The code implements multilayer perceptron to train the MNIST data. The code demonstrates the use of "SimpleBind" C++ API and MNISTIter. The example is designed to work on GPU. The example does not require command line paratmeters.
+
+```
+./mlp_gpu
+```
+
+### [mlp_csv.cpp]()
+The code implements multilayer perceptron to train the MNIST data. The code demonstrates the use of "SimpleBind" C++ API and CSVIter. The CSVIter can iterate data that is in CSV format. The example can be run on CPU or GPU. The example usage is as follows:
+
+```
+mlp_csv --train mnist_training_set.csv --test mnist_test_set.csv --epochs 10 --batch_size 100 --hidden_units "128,64,64 [--gpu]"
+```
+
+### [resnet.cpp]()
+
+The code implements resnet model using C++ API. The model is used to train MNIST data. The number of epochs for training the model can be specified on the command line.
By default, model is trained for 100 epochs.
+
+```
+./resnet 10
+```
+
+### [lenet.cpp]()
+
+The code implements lenet model using C++ API. It uses MNIST training data in CSV format to train the network. The example does not use built-in CSVIter to read the data from CSV file. The number of epochs can be specified on the command line. By default, the mode is trained for 100000 epochs.
+
+```
+./lenet 10
+```
+### [lenet\_with\_mxdataiter.cpp]()
+
+The code implements lenet model using C++ API. It uses MNIST training data to train the network. The example uses built-in MNISTIter to read the data. The number of epochs can be specified on the command line. By default, the mode is trained for 100 epochs.
+
+```
+./lenet_with_mxdataiter 10
+```
+
+In addition, there is `run_lenet_with_mxdataiter.sh` that downloads the mnist data and runs the `lenet_with_mxdataiter` example.
+
+### [inception_bn.cpp]()
+
+The code implements Inception network using C++ API with batch normalization. The example uses MNIST data to train the network. The model trains for 100 epochs.
+
+```
+./inception_bn
+```

From 90fbbd7364b77d8404bf6c67168ab704dba7d5cc Mon Sep 17 00:00:00 2001
From: Amol Lele <19983848+leleamol@users.noreply.github.com>
Date: Thu, 4 Oct 2018 10:53:53 -0700
Subject: [PATCH 2/4] Updating the readme file for cpp-package and adding readme file for example directory.

---
 cpp-package/README.md         |  45 ++++++++++----
 cpp-package/example/README.md | 112 ++++++++++++++++++++++++++++++++++
 2 files changed, 145 insertions(+), 12 deletions(-)
 create mode 100644 cpp-package/example/README.md

diff --git a/cpp-package/README.md b/cpp-package/README.md
index 2b6e0e39f0fd..f7e48ddd7775 100644
--- a/cpp-package/README.md
+++ b/cpp-package/README.md
@@ -1,21 +1,42 @@
 # MXNet C++ Package
-To build the C++ package, please refer to [this guide]().
+The MXNet C++ Package provides C++ API bindings to the users of MXNet. Currently, these bindings are not available as a standalone package.
+The users of these bindings are required to build this package as mentioned below.
-A basic tutorial can be found at .
+## Building C++ Package
-The example directory contains examples for you to get started.
+The cpp-package directory contains the implementation of C++ API. As mentioned above, users are required to build this directory or package before using it.
+**The cpp-package is built while building the MXNet shared library, *libmxnet.so*.**
+
+### Steps to build the C++ package:
+1. Building the MXNet C++ package requires building MXNet from source.
+2. Clone the MXNet github repository **recursively** to ensure the code in submodules is available for building MXNet.
+3. Install the [prerequisites](), desired [BLAS libraries]() and optional [OpenCV, CUDA, and cuDNN]() for building MXNet from source.
+4. There is a configuration file for make, [make/config.mk]() that contains all the compilation options. You can edit this file and set the appropriate options prior to running the **make** command.
+5. Please refer to [platfrom specific build instructions]() and available [build configurations](https://mxnet.incubator.apache.org/install/build_from_source#build-configurations) for more details.
+5. For enabling the build of C++ Package, set the **USE__CPP__PACKAGE = 1** in [make/config.mk](). Optionally, the compilation flag can also be specified on **make** command line as follows:
+   ```
+   make -j USE_CPP_PACKAGE=1
+   ```
+
+## Usage
+
+In order to consume the C++ API please follow the steps below
-## Building C++ examples in examples folder
+1. Ensure that the MXNet shared library is built from source with the **USE__CPP__PACKAGE = 1**.
+2. Include the [MxNetCpp.h]() in the program that is going to consume MXNet C++ API.
+   ```
+   #include
+   ```
+3. While building the program, ensure that the correct paths to the directories containing header files and MxNet shared library.
+4. The program links MxNet shared library dynamically.
Hence the library needs to be accessible to the program during the runtime. This can be achieved by including the path to shared library to environment variable such as LD_LIBRARY_PATH.
-From cpp-package/examples directory
-- Build all examples in release mode: **make all**
-- Build all examples in debug mode : **make debug**
-By default, the examples are build to be run on GPU.
-To build examples to run on CPU:
-- Release: **make all MXNET_USE_CPU=1**
-- Debug: **make debug MXNET_USE_CPU=1**
+## Tutorial
+A basic tutorial can be found at .
+
+## Examples
+
+The example directory contains examples for you to get started.
-The makefile will also download the necessary data files and store in data folder. (The download will take couple of minutes, but will be done only once on a fresh installation.)

diff --git a/cpp-package/example/README.md b/cpp-package/example/README.md
new file mode 100644
index 000000000000..7effdeb40eb7
--- /dev/null
+++ b/cpp-package/example/README.md
@@ -0,0 +1,112 @@
+# MXNet C++ Package Examples
+
+## Building C++ examples
+
+The examples are built while building the MXNet library and cpp-package from source. However, they can be built manually as follows:
+
+From cpp-package/examples directory
+
+- Build all examples in release mode: **make all**
+- Build all examples in debug mode: **make debug**
+
+By default, the examples are build to be run on GPU. To build examples to run on CPU:
+
+- Release: **make all MXNET\_USE\_CPU=1**
+- Debug: **make debug MXNET\_USE\_CPU=1**
+
+The examples that are build to be run on GPU may not work on the non-GPU machines.
+The makefile will also download the necessary data files and store in data folder. (The download will take couple of minutes, but will be done only once on a fresh installation.)
+
+
+## Examples
+
+This directory contains following examples.
In order to run the examples, ensure that the path to the MXNet shared library is added to the OS specific environment variable such as _LD\_LIBRARY\_PATH_ .
+
+### [alexnet.cpp]()
+
+The example implements C++ version of AlexNet. The networks trains the MNIST data. The number of epochs can be specified as command line arguement. For example:
+ ```
+ ./alexnet 10
+ ```
+
+### [charRNN.cpp]()
+
+The code implements C++ version charRNN for mxnet\example\rnn\char-rnn.ipynb with MXNet.cpp API. The generated params file is compatiable with python version. The train() and predict() has been verified with original data samples.
+
+The example expects arguments as follows:
+
+```
+ ./charRNN train [BuildIn\ [TImeMajor] {corpus file} { batch size} { max epoch} [{starting epoch}]
+ ./charRNN predict [BuildIn\ [TImeMajor] {param file} { batch size} { max epoch} [{starting epoch}]
+```
+
+### [googlenet.cpp]()
+
+The code implements GoogLeNet/Inception network using C++ API. The example uses MNIST data to train the network. The number of epochs can be specified in the command line as follows. If not specified, the model trains for 100 epochs.
+
+```
+./googlenet 10
+```
+
+### [mlp.cpp]()
+
+The code implements multilayer perceptron from scratch. The example creates its own dummy data to train the model. The example does not require command line parameters. It trains the model for 20000 iterations.
+
+```
+./mlp
+```
+
+### [mlp_cpu.cpp]()
+
+The code implements multilayer perceptron to train the MNIST data. The code demonstrates the use of "SimpleBind" C++ API and MNISTIter. The example is designed to work on CPU. The example does not require command line parameters.
+
+```
+./mlp_cpu
+```
+
+### [mlp_gpu.cpp]()
+The code implements multilayer perceptron to train the MNIST data. The code demonstrates the use of "SimpleBind" C++ API and MNISTIter. The example is designed to work on GPU. The example does not require command line paratmeters.
+
+```
+./mlp_gpu
+```
+
+### [mlp_csv.cpp]()
+The code implements multilayer perceptron to train the MNIST data. The code demonstrates the use of "SimpleBind" C++ API and CSVIter. The CSVIter can iterate data that is in CSV format. The example can be run on CPU or GPU. The example usage is as follows:
+
+```
+mlp_csv --train mnist_training_set.csv --test mnist_test_set.csv --epochs 10 --batch_size 100 --hidden_units "128,64,64 [--gpu]"
+```
+
+### [resnet.cpp]()
+
+The code implements resnet model using C++ API. The model is used to train MNIST data. The number of epochs for training the model can be specified on the command line. By default, model is trained for 100 epochs.
+
+```
+./resnet 10
+```
+
+### [lenet.cpp]()
+
+The code implements lenet model using C++ API. It uses MNIST training data in CSV format to train the network. The example does not use built-in CSVIter to read the data from CSV file. The number of epochs can be specified on the command line. By default, the mode is trained for 100000 epochs.
+
+```
+./lenet 10
+```
+### [lenet\_with\_mxdataiter.cpp]()
+
+The code implements lenet model using C++ API. It uses MNIST training data to train the network. The example uses built-in MNISTIter to read the data. The number of epochs can be specified on the command line. By default, the mode is trained for 100 epochs.
+
+```
+./lenet_with_mxdataiter 10
+```
+
+In addition, there is `run_lenet_with_mxdataiter.sh` that downloads the mnist data and runs the `lenet_with_mxdataiter` example.
+
+### [inception_bn.cpp]()
+
+The code implements Inception network using C++ API with batch normalization. The example uses MNIST data to train the network. The model trains for 100 epochs.
+
+```
+./inception_bn
+```

From 9ee38dede06d0a6bd6e0997a0ca00cc02c862063 Mon Sep 17 00:00:00 2001
From: Amol Lele <19983848+leleamol@users.noreply.github.com>
Date: Tue, 9 Oct 2018 14:42:36 -0700
Subject: [PATCH 3/4] Addressed the review comments.
---
 cpp-package/README.md         | 18 +++++++++------
 cpp-package/example/README.md | 43 ++++++++++++++---------------------
 2 files changed, 28 insertions(+), 33 deletions(-)

diff --git a/cpp-package/README.md b/cpp-package/README.md
index f7e48ddd7775..c4fe63c9ec58 100644
--- a/cpp-package/README.md
+++ b/cpp-package/README.md
@@ -10,26 +10,30 @@ The cpp-package directory contains the implementation of C++ API. As mentioned a
 ### Steps to build the C++ package:
 1. Building the MXNet C++ package requires building MXNet from source.
-2. Clone the MXNet github repository **recursively** to ensure the code in submodules is available for building MXNet.
+2. Clone the MXNet GitHub repository **recursively** to ensure the code in submodules is available for building MXNet.
+   ```
+   git clone --recursive https://github.com/apache/incubator-mxnet mxnet
+   ```
+
 3. Install the [prerequisites](), desired [BLAS libraries]() and optional [OpenCV, CUDA, and cuDNN]() for building MXNet from source.
 4. There is a configuration file for make, [make/config.mk]() that contains all the compilation options. You can edit this file and set the appropriate options prior to running the **make** command.
-5. Please refer to [platfrom specific build instructions]() and available [build configurations](https://mxnet.incubator.apache.org/install/build_from_source#build-configurations) for more details.
+5. Please refer to [platform specific build instructions]() and available [build configurations](https://mxnet.incubator.apache.org/install/build_from_source#build-configurations) for more details.
-5. For enabling the build of C++ Package, set the **USE__CPP__PACKAGE = 1** in [make/config.mk](). Optionally, the compilation flag can also be specified on **make** command line as follows:
+5. For enabling the build of C++ Package, set the **USE\_CPP\_PACKAGE = 1** in [make/config.mk](). Optionally, the compilation flag can also be specified on the **make** command line as follows.
 ```
 make -j USE_CPP_PACKAGE=1
 ```

 ## Usage

-In order to consume the C++ API please follow the steps below
+In order to consume the C++ API, please follow the steps below.

-1. Ensure that the MXNet shared library is built from source with the **USE__CPP__PACKAGE = 1**.
+1. Ensure that the MXNet shared library is built from source with the **USE\_CPP\_PACKAGE = 1**.
 2. Include the [MxNetCpp.h]() in the program that is going to consume MXNet C++ API.
 ```
 #include
 ```
-3. While building the program, ensure that the correct paths to the directories containing header files and MxNet shared library.
-4. The program links MxNet shared library dynamically. Hence the library needs to be accessible to the program during the runtime. This can be achieved by including the path to shared library to environment variable such as LD_LIBRARY_PATH.
+3. While building the program, ensure that the correct paths to the directories containing the header files and the MXNet shared library are specified.
+4. The program links the MXNet shared library dynamically. Hence the library needs to be accessible to the program during runtime. This can be achieved by including the path to the shared library in the environment variable **LD\_LIBRARY\_PATH** for Linux, Mac, and Ubuntu OS and **PATH** for Windows OS.

 ## Tutorial

diff --git a/cpp-package/example/README.md b/cpp-package/example/README.md
index 7effdeb40eb7..9186aa00cc57 100644
--- a/cpp-package/example/README.md
+++ b/cpp-package/example/README.md
@@ -9,40 +9,30 @@ From cpp-package/examples directory
 - Build all examples in release mode: **make all**
 - Build all examples in debug mode: **make debug**

-By default, the examples are build to be run on GPU. To build examples to run on CPU:
+By default, the examples are built to be run on GPU. To build examples to run on CPU:

 - Release: **make all MXNET\_USE\_CPU=1**
 - Debug: **make debug MXNET\_USE\_CPU=1**

-The examples that are build to be run on GPU may not work on the non-GPU machines.
-The makefile will also download the necessary data files and store in data folder. (The download will take couple of minutes, but will be done only once on a fresh installation.)
+The examples that are built to be run on GPU may not work on the non-GPU machines.
+The makefile will also download the necessary data files and store in a data folder. (The download will take a couple of minutes, but will be done only once on a fresh installation.)

 ## Examples

-This directory contains following examples. In order to run the examples, ensure that the path to the MXNet shared library is added to the OS specific environment variable such as _LD\_LIBRARY\_PATH_ .
+This directory contains the following examples. In order to run the examples, ensure that the path to the MXNet shared library is added to the OS specific environment variable viz. **LD\_LIBRARY\_PATH** for Linux, Mac and Ubuntu OS and **PATH** for Windows OS.

 ### [alexnet.cpp]()

-The example implements C++ version of AlexNet. The networks trains the MNIST data. The number of epochs can be specified as command line arguement. For example:
+The example implements the C++ version of AlexNet. The network trains on MNIST data. The number of epochs can be specified as a command line argument. For example, to train with 10 epochs use the following:
+
 ```
 ./alexnet 10
 ```

-### [charRNN.cpp]()
-
-The code implements C++ version charRNN for mxnet\example\rnn\char-rnn.ipynb with MXNet.cpp API. The generated params file is compatiable with python version. The train() and predict() has been verified with original data samples.
-
-The example expects arguments as follows:
-
-```
- ./charRNN train [BuildIn\ [TImeMajor] {corpus file} { batch size} { max epoch} [{starting epoch}]
- ./charRNN predict [BuildIn\ [TImeMajor] {param file} { batch size} { max epoch} [{starting epoch}]
-```
-
 ### [googlenet.cpp]()

-The code implements GoogLeNet/Inception network using C++ API. The example uses MNIST data to train the network.
The number of epochs can be specified in the command line as follows. If not specified, the model trains for 100 epochs.
+The code implements a GoogLeNet/Inception network using the C++ API. The example uses MNIST data to train the network. By default, the example trains the model for 100 epochs. The number of epochs can also be specified in the command line. For example, to train the model for 10 epochs use the following:

 ```
 ./googlenet 10
@@ -50,7 +40,8 @@ The code implements GoogLeNet/Inception network using C++ API. The example uses

 ### [mlp.cpp]()

-The code implements multilayer perceptron from scratch. The example creates its own dummy data to train the model. The example does not require command line parameters. It trains the model for 20000 iterations.
+The code implements a multilayer perceptron from scratch. The example creates its own dummy data to train the model. The example does not require command line parameters. It trains the model for 20,000 epochs.
+To run the example use the following command:

 ```
 ./mlp
@@ -58,21 +49,21 @@ The code implements multilayer perceptron from scratch. The example creates its

 ### [mlp_cpu.cpp]()

-The code implements multilayer perceptron to train the MNIST data. The code demonstrates the use of "SimpleBind" C++ API and MNISTIter. The example is designed to work on CPU. The example does not require command line parameters.
-
+The code implements a multilayer perceptron to train the MNIST data. The code demonstrates the use of "SimpleBind" C++ API and MNISTIter. The example is designed to work on CPU. The example does not require command line parameters.
+To run the example use the following command:
 ```
 ./mlp_cpu
 ```

 ### [mlp_gpu.cpp]()
-The code implements multilayer perceptron to train the MNIST data. The code demonstrates the use of "SimpleBind" C++ API and MNISTIter. The example is designed to work on GPU. The example does not require command line paratmeters.
+The code implements multilayer perceptron to train the MNIST data. The code demonstrates the use of the "SimpleBind" C++ API and MNISTIter. The example is designed to work on GPU. The example does not require command line arguments. To run the example execute following command:

 ```
 ./mlp_gpu
 ```

 ### [mlp_csv.cpp]()
-The code implements multilayer perceptron to train the MNIST data. The code demonstrates the use of "SimpleBind" C++ API and CSVIter. The CSVIter can iterate data that is in CSV format. The example can be run on CPU or GPU. The example usage is as follows:
+The code implements a multilayer perceptron to train the MNIST data. The code demonstrates the use of the "SimpleBind" C++ API and CSVIter. The CSVIter can iterate data that is in CSV format. The example can be run on CPU or GPU. The example usage is as follows:

 ```
 mlp_csv --train mnist_training_set.csv --test mnist_test_set.csv --epochs 10 --batch_size 100 --hidden_units "128,64,64 [--gpu]"
@@ -80,7 +71,7 @@ mlp_csv --train mnist_training_set.csv --test mnist_test_set.csv --epochs 10 --b

 ### [resnet.cpp]()

-The code implements resnet model using C++ API. The model is used to train MNIST data. The number of epochs for training the model can be specified on the command line. By default, model is trained for 100 epochs.
+The code implements a resnet model using the C++ API. The model is used to train MNIST data. The number of epochs for training the model can be specified on the command line. By default, the model is trained for 100 epochs. For example, to train with 10 epochs use the following command:

 ```
 ./resnet 10
@@ -88,14 +79,14 @@ The code implements resnet model using C++ API. The model is used to train MNIST

 ### [lenet.cpp]()

-The code implements lenet model using C++ API. It uses MNIST training data in CSV format to train the network. The example does not use built-in CSVIter to read the data from CSV file. The number of epochs can be specified on the command line.
By default, the mode is trained for 100000 epochs.
+The code implements a lenet model using the C++ API. It uses MNIST training data in CSV format to train the network. The example does not use built-in CSVIter to read the data from CSV file. The number of epochs can be specified on the command line. By default, the model is trained for 100,000 epochs. For example, to train with 10 epochs use the following command:

 ```
 ./lenet 10
 ```
 ### [lenet\_with\_mxdataiter.cpp]()

-The code implements lenet model using C++ API. It uses MNIST training data to train the network. The example uses built-in MNISTIter to read the data. The number of epochs can be specified on the command line. By default, the mode is trained for 100 epochs.
+The code implements a lenet model using the C++ API. It uses MNIST training data to train the network. The example uses built-in MNISTIter to read the data. The number of epochs can be specified on the command line. By default, the model is trained for 100 epochs. For example, to train with 10 epochs use the following command:

 ```
 ./lenet_with_mxdataiter 10
@@ -105,7 +96,7 @@ In addition, there is `run_lenet_with_mxdataiter.sh` that downloads the mnist da

 ### [inception_bn.cpp]()

-The code implements Inception network using C++ API with batch normalization. The example uses MNIST data to train the network. The model trains for 100 epochs.
+The code implements an Inception network using the C++ API with batch normalization. The example uses MNIST data to train the network. The model trains for 100 epochs.
The example can be run by executing the following command:

 ```
 ./inception_bn
 ```

From 00bce8c7b960842a53add0974a57c84873a3debd Mon Sep 17 00:00:00 2001
From: Amol Lele <19983848+leleamol@users.noreply.github.com>
Date: Tue, 9 Oct 2018 16:41:00 -0700
Subject: [PATCH 4/4] Addressed the review comments

---
 cpp-package/example/README.md | 70 ++---------------------------------
 1 file changed, 3 insertions(+), 67 deletions(-)

diff --git a/cpp-package/example/README.md b/cpp-package/example/README.md
index 76f6a0127c0b..5d2f3b01f8f5 100644
--- a/cpp-package/example/README.md
+++ b/cpp-package/example/README.md
@@ -9,64 +9,30 @@ From cpp-package/examples directory
 - Build all examples in release mode: **make all**
 - Build all examples in debug mode: **make debug**

-<<<<<<< HEAD
 By default, the examples are built to be run on GPU. To build examples to run on CPU:
-=======
-By default, the examples are build to be run on GPU. To build examples to run on CPU:
->>>>>>> 39054b349e83ead13127cd1bd6b90e3141bc0451

 - Release: **make all MXNET\_USE\_CPU=1**
 - Debug: **make debug MXNET\_USE\_CPU=1**

-<<<<<<< HEAD
 The examples that are built to be run on GPU may not work on the non-GPU machines.
 The makefile will also download the necessary data files and store in a data folder. (The download will take a couple of minutes, but will be done only once on a fresh installation.)
-=======
-The examples that are build to be run on GPU may not work on the non-GPU machines.
-The makefile will also download the necessary data files and store in data folder. (The download will take couple of minutes, but will be done only once on a fresh installation.)
->>>>>>> 39054b349e83ead13127cd1bd6b90e3141bc0451

 ## Examples

-<<<<<<< HEAD
 This directory contains the following examples. In order to run the examples, ensure that the path to the MXNet shared library is added to the OS specific environment variable viz. **LD\_LIBRARY\_PATH** for Linux, Mac and Ubuntu OS and **PATH** for Windows OS.
 ### [alexnet.cpp]()

 The example implements the C++ version of AlexNet. The network trains on MNIST data. The number of epochs can be specified as a command line argument. For example, to train with 10 epochs use the following:
-=======
-This directory contains following examples. In order to run the examples, ensure that the path to the MXNet shared library is added to the OS specific environment variable such as _LD\_LIBRARY\_PATH_ .
-
-### [alexnet.cpp]()
-
-The example implements C++ version of AlexNet. The networks trains the MNIST data. The number of epochs can be specified as command line arguement. For example:
->>>>>>> 39054b349e83ead13127cd1bd6b90e3141bc0451

 ```
 ./alexnet 10
 ```

-<<<<<<< HEAD
 ### [googlenet.cpp]()

 The code implements a GoogLeNet/Inception network using the C++ API. The example uses MNIST data to train the network. By default, the example trains the model for 100 epochs. The number of epochs can also be specified in the command line. For example, to train the model for 10 epochs use the following:
-=======
-### [charRNN.cpp]()
-
-The code implements C++ version charRNN for mxnet\example\rnn\char-rnn.ipynb with MXNet.cpp API. The generated params file is compatiable with python version. The train() and predict() has been verified with original data samples.
-
-The example expects arguments as follows:
-
-```
- ./charRNN train [BuildIn\ [TImeMajor] {corpus file} { batch size} { max epoch} [{starting epoch}]
- ./charRNN predict [BuildIn\ [TImeMajor] {param file} { batch size} { max epoch} [{starting epoch}]
-```
-
-### [googlenet.cpp]()
-
-The code implements GoogLeNet/Inception network using C++ API. The example uses MNIST data to train the network. The number of epochs can be specified in the command line as follows. If not specified, the model trains for 100 epochs.
->>>>>>> 39054b349e83ead13127cd1bd6b90e3141bc0451

 ```
 ./googlenet 10
@@ -74,12 +40,8 @@ The code implements GoogLeNet/Inception network using C++ API.
### [mlp.cpp]()

The code implements a multilayer perceptron from scratch. The example creates its own dummy data to train the model. The example does not require command line parameters. It trains the model for 20,000 epochs. To run the example use the following command:

```
./mlp
```

### [mlp_cpu.cpp]()

The code implements a multilayer perceptron to train on the MNIST data. The code demonstrates the use of the "SimpleBind" C++ API and MNISTIter. The example is designed to work on CPU. The example does not require command line parameters. To run the example use the following command:

```
./mlp_cpu
```

### [mlp_gpu.cpp]()

The code implements a multilayer perceptron to train on the MNIST data.
The code demonstrates the use of the "SimpleBind" C++ API and MNISTIter. The example is designed to work on GPU. The example does not require command line arguments. To run the example execute the following command:

```
./mlp_gpu
```

### [mlp_csv.cpp]()

The code implements a multilayer perceptron to train on the MNIST data. The code demonstrates the use of the "SimpleBind" C++ API and CSVIter. The CSVIter can iterate data that is in CSV format. The example can be run on CPU or GPU. The example usage is as follows:

```
mlp_csv --train mnist_training_set.csv --test mnist_test_set.csv --epochs 10 --batch_size 100 --hidden_units "128,64,64" [--gpu]
```

### [resnet.cpp]()

The code implements a resnet model using the C++ API. The model is used to train on MNIST data. The number of epochs for training the model can be specified on the command line. By default, the model is trained for 100 epochs. For example, to train with 10 epochs use the following command:

```
./resnet 10
```

### [lenet.cpp]()

The code implements a lenet model using the C++ API. It uses MNIST training data in CSV format to train the network.
The example does not use the built-in CSVIter to read the data from the CSV file. The number of epochs can be specified on the command line. By default, the model is trained for 100,000 epochs. For example, to train with 10 epochs use the following command:

```
./lenet 10
```

### [lenet\_with\_mxdataiter.cpp]()

The code implements a lenet model using the C++ API. It uses MNIST training data to train the network. The example uses the built-in MNISTIter to read the data. The number of epochs can be specified on the command line. By default, the model is trained for 100 epochs. For example, to train with 10 epochs use the following command:

```
./lenet_with_mxdataiter 10
```

In addition, there is `run_lenet_with_mxdataiter.sh` that downloads the mnist data.

### [inception_bn.cpp]()

The code implements an Inception network using the C++ API with batch normalization. The example uses MNIST data to train the network. The model trains for 100 epochs. The example can be run by executing the following command:

```
./inception_bn
```
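Batch normalization, which inception_bn.cpp applies through the C++ API, standardizes each feature across a batch. A minimal numeric sketch of the core transform follows (illustrative only: the function name and epsilon value are assumptions, and the real BatchNorm operator also learns scale and shift parameters):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Core batch-normalization transform over one feature column: subtract the
// batch mean and divide by sqrt(variance + epsilon). The real BatchNorm
// operator additionally applies learned scale (gamma) and shift (beta)
// parameters; those are omitted here.
std::vector<double> batch_normalize(const std::vector<double>& x, double eps = 1e-5) {
    const std::size_t n = x.size();
    if (n == 0) return {};
    double mean = 0.0;
    for (double v : x) mean += v;
    mean /= static_cast<double>(n);
    double var = 0.0;
    for (double v : x) var += (v - mean) * (v - mean);
    var /= static_cast<double>(n);
    std::vector<double> out;
    out.reserve(n);
    for (double v : x) out.push_back((v - mean) / std::sqrt(var + eps));
    return out;
}
```

For the batch {1.0, 2.0, 3.0}, the middle value maps to roughly zero and the outer values to symmetric positive/negative outputs, which is the behavior that keeps activations well scaled during training.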