From 6e27a715b0e31fec1bf463aaf763f7c4488705e2 Mon Sep 17 00:00:00 2001 From: Acharya Date: Tue, 27 Mar 2018 16:34:52 -0700 Subject: [PATCH 1/6] ONNX Documentation --- docs/api/python/contrib/onnx.md | 47 +++++++++++ docs/api/python/index.md | 1 + docs/tutorials/onnx/super_resolution.md | 83 +++++++++++++++++++ python/mxnet/contrib/onnx/__init__.py | 2 +- .../contrib/onnx/_import/import_model.py | 11 ++- 5 files changed, 137 insertions(+), 7 deletions(-) create mode 100644 docs/api/python/contrib/onnx.md create mode 100644 docs/tutorials/onnx/super_resolution.md diff --git a/docs/api/python/contrib/onnx.md b/docs/api/python/contrib/onnx.md new file mode 100644 index 000000000000..6f32dd39dff0 --- /dev/null +++ b/docs/api/python/contrib/onnx.md @@ -0,0 +1,47 @@ +# ONNX-MXNet API + +## Overview + +The `mxnet.contrib.onnx` package refers to the APIs and interfaces that implements ONNX model format support for Apache MXNet. + +With ONNX format support for MXNet, developers can build and train models with PyTorch, CNTK, or Caffe2, and import these models into MXNet to run them for inference using MXNet’s highly optimized engine. + +```eval_rst +.. warning:: This package contains experimental APIs and may change in the near future. +``` + +```eval_rst +.. note:: **Install ONNX** which needs protobuf compiler to be installed separately. Please **follow the instructions to install ONNX** - https://github.com/onnx/onnx. +``` + +This document describes the ONNX APIs in mxnet. + +```eval_rst +.. autosummary:: + :nosignatures: + + mxnet.contrib.onnx.import_model +``` + +## ONNX Tutorials + +```eval_rst +.. toctree:: + :maxdepth: 1 + + /tutorials/onnx/super_resolution.md + /tutorials/onnx/inference_on_onnx_model.md +``` + +## API Reference + + + +```eval_rst + +.. 
automodule:: mxnet.contrib.onnx + :members: import_model + +``` + + \ No newline at end of file diff --git a/docs/api/python/index.md b/docs/api/python/index.md index f65d3abfb15f..b097e2045b14 100644 --- a/docs/api/python/index.md +++ b/docs/api/python/index.md @@ -151,4 +151,5 @@ imported by running: contrib/contrib.md contrib/text.md + contrib/onnx.md ``` diff --git a/docs/tutorials/onnx/super_resolution.md b/docs/tutorials/onnx/super_resolution.md new file mode 100644 index 000000000000..476b79a23f23 --- /dev/null +++ b/docs/tutorials/onnx/super_resolution.md @@ -0,0 +1,83 @@ +## Import ONNX model into Mxnet + +The code demonstration in this document assumes that the following packages are installed and imported. + +```python +from PIL import Image +import numpy as np +import mxnet as mx +import mxnet.contrib.onnx as onnx_mxnet +``` + +The PIL package is a Python Image Processing package and is required for input preprocessing. It can be installed as follows + +```python +pip install Pillow +``` + +We will now try to import a [Super_Resolution model](http://pytorch.org/tutorials/advanced/super_resolution_with_caffe2.html), trained with PyTorch, and run inference in MXNet. PyTorch provides a way to export models in ONNX protobuf format. Using this functionality, we have exported the model into ONNX format. + +You can download the converted ONNX model from +[here](https://s3.amazonaws.com/onnx-mxnet/examples/super_resolution.onnx). + +A pre-trained model in MXNet contains two elements: a symbolic graph, containing the model's network definition, and a binary file containing the model weights. 
You can import the ONNX model and get the symbol and parameters objects using "import_model" API as shown below: + +```python +sym, params = onnx_mxnet.import_model('super_resolution.onnx') +``` + +To run inference on the imported mxnet model, you need to use MXNet's [Module API](https://mxnet.incubator.apache.org/api/python/module.html), following these steps: + +- Input image preprocessing + +For the input image pre-process step, we will download and transform the image into an input tensor: + +```python +img_url = 'https://s3.amazonaws.com/onnx-mxnet/examples/super_res_input.jpg' +download(img_url, 'super_res_input.jpg') +img = Image.open('super_res_input.jpg').resize((224, 224)) +img_ycbcr = img.convert("YCbCr") +img_y, img_cb, img_cr = img_ycbcr.split() +test_image = np.array(img_y)[np.newaxis, np.newaxis, :, :] +``` +- We'll be using MXNet's Module API to create the module, bind it and assign the loaded weights. + +``` +# By default, 'input_0' is an input of the imported model. +mod = mx.mod.Module(symbol=sym, data_names=['input_0'], context=mx.cpu(), label_names=None) +mod.bind(for_training=False, data_shapes=[('input_0',test_image.shape)], label_shapes=None) +mod.set_params(arg_params=params, aux_params=params, allow_missing=True, allow_extra=True) +``` + +- Run inference +``` +# Forward method needs Batch of data as input +from collections import namedtuple +Batch = namedtuple('Batch', ['data']) + +# forward on the provided data batch +mod.forward(Batch([mx.nd.array(test_image)])) +``` + +- To get the output of previous forward computation, use "module.get_outputs()" method. 
+It returns ndarray that we convert to numpy array, create and save the super resolution image:
+```
+output = mod.get_outputs()[0][0][0]
+img_out_y = Image.fromarray(np.uint8(output.asnumpy().clip(0, 255)), mode='L')
+result_img = Image.merge(
+"YCbCr", [
+	img_out_y,
+	img_cb.resize(img_out_y.size, Image.BICUBIC),
+	img_cr.resize(img_out_y.size, Image.BICUBIC)
+]).convert("RGB")
+result_img.save("super_res_output.jpg")
+
+```
+
+Here's the input image and the resulting output images compared. As you can see, the model was able to increase the spatial resolution from 256x256 to 672x672.
+
+| Input Image | Output Image |
+| ----------- | ------------ |
+| ![input](https://s3.amazonaws.com/onnx-mxnet/examples/super_res_input.jpg) | ![output](https://s3.amazonaws.com/onnx-mxnet/examples/super_res_expected_output.jpg) |
+
\ No newline at end of file
diff --git a/python/mxnet/contrib/onnx/__init__.py b/python/mxnet/contrib/onnx/__init__.py
index eff91206298f..169ac673455c 100644
--- a/python/mxnet/contrib/onnx/__init__.py
+++ b/python/mxnet/contrib/onnx/__init__.py
@@ -15,6 +15,6 @@
 # specific language governing permissions and limitations
 # under the License.
 
-"""Module for importing and exporting ONNX models."""
+"""Module for ONNX model format support for Apache MXNet."""
 
 from ._import.import_model import import_model
diff --git a/python/mxnet/contrib/onnx/_import/import_model.py b/python/mxnet/contrib/onnx/_import/import_model.py
index d8d32a96a216..c1a26ea4d6e0 100644
--- a/python/mxnet/contrib/onnx/_import/import_model.py
+++ b/python/mxnet/contrib/onnx/_import/import_model.py
@@ -22,7 +22,7 @@
 from .import_onnx import GraphProto
 
 def import_model(model_file):
-    """Imports the ONNX model file passed as a parameter into MXNet symbol and parameters.
+    """Imports the ONNX model file, passed as a parameter, into MXNet symbol and parameters.
 
     Parameters
     ----------
@@ -31,12 +31,11 @@ def import_model(model_file):
 
     Returns
     -------
-    Mxnet symbol and parameter objects.
+ sym : :class:`~mxnet.symbol.Symbol` + MXNet symbol object - sym : mxnet.symbol - Mxnet symbol - params : dict of str to mx.ndarray - Dict of converted parameters stored in mxnet.ndarray format + params : dict of ``str`` to :class:`~mxnet.ndarray.NDArray` + Dict of converted parameters stored in ``mxnet.ndarray.NDArray`` format """ graph = GraphProto() From ea11c485c59b6a69e20a089e4d044bdc8cb84fc7 Mon Sep 17 00:00:00 2001 From: Acharya Date: Wed, 28 Mar 2018 16:01:27 -0700 Subject: [PATCH 2/6] Fix rendering issues. --- docs/api/python/contrib/onnx.md | 6 +- docs/tutorials/index.md | 2 + .../tutorials/onnx/inference_on_onnx_model.md | 1 - docs/tutorials/onnx/super_resolution.md | 126 +++++++++++++----- .../contrib/onnx/_import/import_model.py | 10 +- .../mxnet/contrib/onnx/_import/import_onnx.py | 5 +- 6 files changed, 110 insertions(+), 40 deletions(-) diff --git a/docs/api/python/contrib/onnx.md b/docs/api/python/contrib/onnx.md index 6f32dd39dff0..35abd6ee9aae 100644 --- a/docs/api/python/contrib/onnx.md +++ b/docs/api/python/contrib/onnx.md @@ -2,16 +2,18 @@ ## Overview +[ONNX](https://onnx.ai/) is an open format to represent deep learning models. With ONNX as an intermediate representation, it is easier to move models between state-of-the-art tools and frameworks for training and inference.. + The `mxnet.contrib.onnx` package refers to the APIs and interfaces that implements ONNX model format support for Apache MXNet. -With ONNX format support for MXNet, developers can build and train models with PyTorch, CNTK, or Caffe2, and import these models into MXNet to run them for inference using MXNet’s highly optimized engine. +With ONNX format support for MXNet, developers can build and train models with PyTorch, CNTK, or Caffe2, and import these models into MXNet to run them for inference and training using MXNet’s highly optimized engine. ```eval_rst .. warning:: This package contains experimental APIs and may change in the near future. ``` ```eval_rst -.. 
note:: **Install ONNX** which needs protobuf compiler to be installed separately. Please **follow the instructions to install ONNX** - https://github.com/onnx/onnx. +.. note:: To use this module developers need to **install ONNX**, which needs protobuf compiler, to be installed separately. Please follow the [instructions to install ONNX](https://github.com/onnx/onnx) ``` This document describes the ONNX APIs in mxnet. diff --git a/docs/tutorials/index.md b/docs/tutorials/index.md index 8a597e95bfb7..ff767064d7c9 100644 --- a/docs/tutorials/index.md +++ b/docs/tutorials/index.md @@ -188,6 +188,8 @@ The Gluon and Module tutorials are in Python, but you can also find a variety of - [Text classification (NLP) on Movie Reviews](http://mxnet.incubator.apache.org/tutorials/nlp/cnn.html) +- [Importing an ONNX model into MXNet](http://mxnet.incubator.apache.org/tutorials/onnx/super_resolution.html) + diff --git a/docs/tutorials/onnx/inference_on_onnx_model.md b/docs/tutorials/onnx/inference_on_onnx_model.md index 182a2ae74cde..2b64945e4e9a 100644 --- a/docs/tutorials/onnx/inference_on_onnx_model.md +++ b/docs/tutorials/onnx/inference_on_onnx_model.md @@ -14,7 +14,6 @@ In this tutorial we will: To run the tutorial you will need to have installed the following python modules: - [MXNet](http://mxnet.incubator.apache.org/install/index.html) - [onnx](https://github.com/onnx/onnx) (follow the install guide) -- [onnx-mxnet](https://github.com/onnx/onnx-mxnet) - matplotlib - wget diff --git a/docs/tutorials/onnx/super_resolution.md b/docs/tutorials/onnx/super_resolution.md index 476b79a23f23..b30e04648001 100644 --- a/docs/tutorials/onnx/super_resolution.md +++ b/docs/tutorials/onnx/super_resolution.md @@ -1,57 +1,88 @@ -## Import ONNX model into Mxnet -The code demonstration in this document assumes that the following packages are installed and imported. 
+# Importing an ONNX model into MXNet + +In this tutorial we will: + +- learn how to load a pre-trained ONNX model file into MXNet. +- run inference in MXNet. + +## Pre-requisite +The code demonstration assumes that the following python packages are installed: +- [MXNet](http://mxnet.incubator.apache.org/install/index.html) +- [onnx](https://github.com/onnx/onnx) (follow the install guide) +- Pillow - A Python Image Processing package and is required for input pre-processing. It can be installed with ```pip install Pillow```. +- matplotlib + ```python from PIL import Image import numpy as np import mxnet as mx import mxnet.contrib.onnx as onnx_mxnet +from mxnet.test_utils import download +from matplotlib.pyplot import imshow ``` -The PIL package is a Python Image Processing package and is required for input preprocessing. It can be installed as follows +### Fetching the required files + ```python -pip install Pillow +img_url = 'https://s3.amazonaws.com/onnx-mxnet/examples/super_res_input.jpg' +download(img_url, 'super_res_input.jpg') +model_url = 'https://s3.amazonaws.com/onnx-mxnet/examples/super_resolution.onnx' +onnx_model_file = download(model_url, 'super_resolution.onnx') ``` -We will now try to import a [Super_Resolution model](http://pytorch.org/tutorials/advanced/super_resolution_with_caffe2.html), trained with PyTorch, and run inference in MXNet. PyTorch provides a way to export models in ONNX protobuf format. Using this functionality, we have exported the model into ONNX format. +## Loading the model into MXNet -You can download the converted ONNX model from -[here](https://s3.amazonaws.com/onnx-mxnet/examples/super_resolution.onnx). +To completely describe a pre-trained model in MXNet, we need two elements: a symbolic graph, containing the model's network definition, and a binary file containing the model weights. You can import the ONNX model and get the symbol and parameters objects using "import_model" API. 
The parameter object is split into argument parameters and auxiliary parameters.

-A pre-trained model in MXNet contains two elements: a symbolic graph, containing the model's network definition, and a binary file containing the model weights. You can import the ONNX model and get the symbol and parameters objects using "import_model" API as shown below:

```python
-sym, params = onnx_mxnet.import_model('super_resolution.onnx')
+sym, arg, aux = onnx_mxnet.import_model(onnx_model_file)
```

-To run inference on the imported mxnet model, you need to use MXNet's [Module API](https://mxnet.incubator.apache.org/api/python/module.html), following these steps:
+We can now visualize the imported model( graphviz needs to be installed)
+
+
+```python
+mx.viz.plot_network(sym, node_attrs={"shape":"oval","fixedsize":"false"})
+```
+
+
+
+
+![svg](output_8_0.svg)

-- Input image preprocessing

-For the input image pre-process step, we will download and transform the image into an input tensor:
+
+## Input Pre-processing
+
+We will transform the previously downloaded input image into an input tensor.
+

```python
-img_url = 'https://s3.amazonaws.com/onnx-mxnet/examples/super_res_input.jpg'
-download(img_url, 'super_res_input.jpg')
img = Image.open('super_res_input.jpg').resize((224, 224))
img_ycbcr = img.convert("YCbCr")
img_y, img_cb, img_cr = img_ycbcr.split()
test_image = np.array(img_y)[np.newaxis, np.newaxis, :, :]
```
-- We'll be using MXNet's Module API to create the module, bind it and assign the loaded weights.

-```
-# By default, 'input_0' is an input of the imported model.
+## Run Inference using MXNet's Module API
+
+We will use MXNet's Module API to run the inference. For this we will need to create the module, bind it to the input data and assign the loaded weights from the two parameter objects - argument parameters and auxiliary parameters.
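The split between the two returned dictionaries mirrors MXNet's usual convention: argument parameters hold trainable weights such as convolution kernels and biases, while auxiliary parameters hold non-trainable state such as batch-norm moving statistics. A framework-free sketch of such a split (the `split_params` helper and the parameter names below are illustrative, not part of the MXNet or ONNX APIs):

```python
# Illustrative sketch only: partition a flat name->array mapping into
# argument (trainable) and auxiliary (non-trainable state) parameters.
def split_params(all_params, aux_names):
    """Return (arg_params, aux_params) given the names of auxiliary states."""
    arg_params = {k: v for k, v in all_params.items() if k not in aux_names}
    aux_params = {k: v for k, v in all_params.items() if k in aux_names}
    return arg_params, aux_params

weights = {
    'conv0_weight': [0.1, 0.2],   # trainable -> argument parameter
    'conv0_bias': [0.0],          # trainable -> argument parameter
    'bn0_moving_mean': [0.5],     # running statistic -> auxiliary parameter
}
arg_demo, aux_demo = split_params(weights, aux_names={'bn0_moving_mean'})
print(sorted(arg_demo))  # ['conv0_bias', 'conv0_weight']
print(sorted(aux_demo))  # ['bn0_moving_mean']
```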
+
+
+```python
 mod = mx.mod.Module(symbol=sym, data_names=['input_0'], context=mx.cpu(), label_names=None)
 mod.bind(for_training=False, data_shapes=[('input_0',test_image.shape)], label_shapes=None)
-mod.set_params(arg_params=params, aux_params=params, allow_missing=True, allow_extra=True)
+mod.set_params(arg_params=arg, aux_params=aux, allow_missing=True, allow_extra=True)
 ```
 
-- Run inference
-```
-# Forward method needs Batch of data as input
+Module API's forward method requires Batch of data as input. We will prepare the data in that format and feed it to the forward method.
+
+
+```python
 from collections import namedtuple
 Batch = namedtuple('Batch', ['data'])
@@ -59,25 +90,56 @@ Batch = namedtuple('Batch', ['data'])
 
 # forward on the provided data batch
 mod.forward(Batch([mx.nd.array(test_image)]))
 ```
 
-- To get the output of previous forward computation, use "module.get_outputs()" method.
-It returns ndarray that we convert to numpy array, create and save the super resolution image:
-```
+To get the output of previous forward computation, you use ``module.get_outputs()`` method.
+It returns an ``ndarray`` that we convert to a ``numpy`` array and then to Pillow's image format
+
+
+```python
 output = mod.get_outputs()[0][0][0]
 img_out_y = Image.fromarray(np.uint8(output.asnumpy().clip(0, 255)), mode='L')
 result_img = Image.merge(
 "YCbCr", [
-	img_out_y,
-	img_cb.resize(img_out_y.size, Image.BICUBIC),
-	img_cr.resize(img_out_y.size, Image.BICUBIC)
+    img_out_y,
+    img_cb.resize(img_out_y.size, Image.BICUBIC),
+    img_cr.resize(img_out_y.size, Image.BICUBIC)
 ]).convert("RGB")
-result_img.save("super_res_output.jpg")
+```
+
+### Input Image
+
+```python
+imshow(np.asarray(img))
 ```
 
-Here's the input image and the resulting output images compared. As you can see, the model was able to increase the spatial resolution from 256x256 to 672x672.
-| Input Image | Output Image | -| ----------- | ------------ | -| ![input](https://s3.amazonaws.com/onnx-mxnet/examples/super_res_input.jpg) | ![output](https://s3.amazonaws.com/onnx-mxnet/examples/super_res_expected_output.jpg) | + + + + + + + +![png](output_20_1.png) + + +### Output Image + +The model was able to increase the spatial resolution of the input image from 256x256 to 672x672. + + +```python +imshow(np.asarray(result_img)) +``` + + + + + + + + + +![png](output_22_1.png) \ No newline at end of file diff --git a/python/mxnet/contrib/onnx/_import/import_model.py b/python/mxnet/contrib/onnx/_import/import_model.py index c1a26ea4d6e0..211592eab8e7 100644 --- a/python/mxnet/contrib/onnx/_import/import_model.py +++ b/python/mxnet/contrib/onnx/_import/import_model.py @@ -23,6 +23,7 @@ def import_model(model_file): """Imports the ONNX model file, passed as a parameter, into MXNet symbol and parameters. + Operator support and coverage - https://cwiki.apache.org/confluence/display/MXNET/ONNX Parameters ---------- @@ -34,16 +35,19 @@ def import_model(model_file): sym : :class:`~mxnet.symbol.Symbol` MXNet symbol object - params : dict of ``str`` to :class:`~mxnet.ndarray.NDArray` + arg_params : dict of ``str`` to :class:`~mxnet.ndarray.NDArray` + Dict of converted parameters stored in ``mxnet.ndarray.NDArray`` format + + aux_params : dict of ``str`` to :class:`~mxnet.ndarray.NDArray` Dict of converted parameters stored in ``mxnet.ndarray.NDArray`` format """ graph = GraphProto() - # loads model file and returns ONNX protobuf object try: import onnx except ImportError: - raise ImportError("Onnx and protobuf need to be installed") + raise ImportError("Onnx and protobuf need to be installed. 
Instructions to install - https://github.com/onnx/onnx") + # loads model file and returns ONNX protobuf object model_proto = onnx.load(model_file) sym, arg_params, aux_params = graph.from_onnx(model_proto.graph) return sym, arg_params, aux_params diff --git a/python/mxnet/contrib/onnx/_import/import_onnx.py b/python/mxnet/contrib/onnx/_import/import_onnx.py index 037790c80806..34d6e13b8849 100644 --- a/python/mxnet/contrib/onnx/_import/import_onnx.py +++ b/python/mxnet/contrib/onnx/_import/import_onnx.py @@ -147,8 +147,9 @@ def _parse_array(self, tensor_proto): """Grab data in TensorProto and convert to numpy array.""" try: from onnx.numpy_helper import to_array - except ImportError as e: - raise ImportError("Unable to import onnx which is required {}".format(e)) + except ImportError: + raise ImportError("Onnx and protobuf need to be installed." + + "Instructions to install - https://github.com/onnx/onnx") np_array = to_array(tensor_proto).reshape(tuple(tensor_proto.dims)) return nd.array(np_array) From 1bdb64c323b6e1dcb95af0db3a2b622b8d080cba Mon Sep 17 00:00:00 2001 From: Acharya Date: Wed, 28 Mar 2018 16:26:23 -0700 Subject: [PATCH 3/6] lint issues --- docs/tutorials/onnx/super_resolution.md | 7 +++---- python/mxnet/contrib/onnx/_import/import_model.py | 3 ++- python/mxnet/contrib/onnx/_import/import_onnx.py | 2 +- 3 files changed, 6 insertions(+), 6 deletions(-) diff --git a/docs/tutorials/onnx/super_resolution.md b/docs/tutorials/onnx/super_resolution.md index b30e04648001..19fabc9eca60 100644 --- a/docs/tutorials/onnx/super_resolution.md +++ b/docs/tutorials/onnx/super_resolution.md @@ -1,4 +1,3 @@ - # Importing an ONNX model into MXNet In this tutorial we will: @@ -52,7 +51,7 @@ mx.viz.plot_network(sym, node_attrs={"shape":"oval","fixedsize":"false"}) -![svg](output_8_0.svg) +![svg](https://s3.amazonaws.com/onnx-mxnet/examples/output_8_0.svg) @@ -120,7 +119,7 @@ imshow(np.asarray(img)) -![png](output_20_1.png) 
+![png](https://s3.amazonaws.com/onnx-mxnet/examples/output_20_1.png) ### Output Image @@ -140,6 +139,6 @@ imshow(np.asarray(result_img)) -![png](output_22_1.png) +![png](https://s3.amazonaws.com/onnx-mxnet/examples/output_22_1.png) \ No newline at end of file diff --git a/python/mxnet/contrib/onnx/_import/import_model.py b/python/mxnet/contrib/onnx/_import/import_model.py index 211592eab8e7..1bd4b418bc35 100644 --- a/python/mxnet/contrib/onnx/_import/import_model.py +++ b/python/mxnet/contrib/onnx/_import/import_model.py @@ -46,7 +46,8 @@ def import_model(model_file): try: import onnx except ImportError: - raise ImportError("Onnx and protobuf need to be installed. Instructions to install - https://github.com/onnx/onnx") + raise ImportError("Onnx and protobuf need to be installed. " + + "Instructions to install - https://github.com/onnx/onnx") # loads model file and returns ONNX protobuf object model_proto = onnx.load(model_file) sym, arg_params, aux_params = graph.from_onnx(model_proto.graph) diff --git a/python/mxnet/contrib/onnx/_import/import_onnx.py b/python/mxnet/contrib/onnx/_import/import_onnx.py index 34d6e13b8849..92e7cb9c64e8 100644 --- a/python/mxnet/contrib/onnx/_import/import_onnx.py +++ b/python/mxnet/contrib/onnx/_import/import_onnx.py @@ -148,7 +148,7 @@ def _parse_array(self, tensor_proto): try: from onnx.numpy_helper import to_array except ImportError: - raise ImportError("Onnx and protobuf need to be installed." + raise ImportError("Onnx and protobuf need to be installed. 
" + "Instructions to install - https://github.com/onnx/onnx") np_array = to_array(tensor_proto).reshape(tuple(tensor_proto.dims)) return nd.array(np_array) From 9c3c03c9dbb399febdc4927c273bf7a5ad461c04 Mon Sep 17 00:00:00 2001 From: Acharya Date: Wed, 28 Mar 2018 18:25:24 -0700 Subject: [PATCH 4/6] fixing image links --- docs/api/python/contrib/onnx.md | 2 +- docs/tutorials/onnx/super_resolution.md | 45 ++++--------------------- 2 files changed, 8 insertions(+), 39 deletions(-) diff --git a/docs/api/python/contrib/onnx.md b/docs/api/python/contrib/onnx.md index 35abd6ee9aae..efad386aa264 100644 --- a/docs/api/python/contrib/onnx.md +++ b/docs/api/python/contrib/onnx.md @@ -13,7 +13,7 @@ With ONNX format support for MXNet, developers can build and train models with P ``` ```eval_rst -.. note:: To use this module developers need to **install ONNX**, which needs protobuf compiler, to be installed separately. Please follow the [instructions to install ONNX](https://github.com/onnx/onnx) +.. note:: To use this module developers need to **install ONNX**, which needs protobuf compiler, to be installed separately. Please follow the instructions to install ONNX - https://github.com/onnx/onnx ``` This document describes the ONNX APIs in mxnet. diff --git a/docs/tutorials/onnx/super_resolution.md b/docs/tutorials/onnx/super_resolution.md index 19fabc9eca60..55fb15e6c53a 100644 --- a/docs/tutorials/onnx/super_resolution.md +++ b/docs/tutorials/onnx/super_resolution.md @@ -7,7 +7,7 @@ In this tutorial we will: ## Pre-requisite The code demonstration assumes that the following python packages are installed: -- [MXNet](http://mxnet.incubator.apache.org/install/index.html) +- [mxnet](http://mxnet.incubator.apache.org/install/index.html) - [onnx](https://github.com/onnx/onnx) (follow the install guide) - Pillow - A Python Image Processing package and is required for input pre-processing. It can be installed with ```pip install Pillow```. 
- matplotlib
@@ -34,7 +34,7 @@ onnx_model_file = download(model_url, 'super_resolution.onnx')
 
 ## Loading the model into MXNet
 
-To completely describe a pre-trained model in MXNet, we need two elements: a symbolic graph, containing the model's network definition, and a binary file containing the model weights. You can import the ONNX model and get the symbol and parameters objects using "import_model" API. The parameter object is split into argument parameters and auxiliary parameters.
+To completely describe a pre-trained model in MXNet, we need two elements: a symbolic graph, containing the model's network definition, and a binary file containing the model weights. You can import the ONNX model and get the symbol and parameters objects using ``import_model`` API. The parameter object is split into argument parameters and auxiliary parameters.
 
 
 ```python
@@ -51,7 +51,7 @@ mx.viz.plot_network(sym, node_attrs={"shape":"oval","fixedsize":"false"})
 
 
 
-![svg](https://s3.amazonaws.com/onnx-mxnet/examples/output_8_0.svg)
+![svg](https://s3.amazonaws.com/onnx-mxnet/examples/super_res_mxnet_model.png)
 
 
@@ -104,41 +104,10 @@ result_img = Image.merge(
 ]).convert("RGB")
 ```
 
-### Input Image
-
-```python
-imshow(np.asarray(img))
-```
-
-
-
-
-
-
-
-
-
-![png](https://s3.amazonaws.com/onnx-mxnet/examples/output_20_1.png)
-
-
-### Output Image
-
-The model was able to increase the spatial resolution of the input image from 256x256 to 672x672.
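The clip-and-cast that turns the raw floating-point network output into valid 8-bit pixels can be checked on its own, independent of the model. A small NumPy sketch of the `clip(0, 255)` followed by the `uint8` cast used in the tutorial (the sample values are made up for illustration):

```python
import numpy as np

# Raw network output is float and can fall outside the displayable 0-255 range;
# clipping before the uint8 cast avoids wrap-around artifacts.
raw = np.array([[-12.3, 0.0], [128.9, 300.7]])
pixels = np.uint8(raw.clip(0, 255))
print(pixels.tolist())  # [[0, 0], [128, 255]]
```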
- - -```python -imshow(np.asarray(result_img)) -``` - - - - - - - - - -![png](https://s3.amazonaws.com/onnx-mxnet/examples/output_22_1.png) +| Input Image | Output Image | +| ----------- | ------------ | +| ![input](https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/doc/tutorials/onnx/images/super_res_input.jpg?raw=true) | ![output](https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/doc/tutorials/onnx/images/super_res_output.jpg?raw=true) | \ No newline at end of file From d70407ade47b43164d9ab885419aa6106b421c36 Mon Sep 17 00:00:00 2001 From: Acharya Date: Thu, 29 Mar 2018 14:12:02 -0700 Subject: [PATCH 5/6] Fix Aaron's comments --- docs/api/python/contrib/onnx.md | 14 +++++++------- docs/tutorials/onnx/super_resolution.md | 9 +++++---- 2 files changed, 12 insertions(+), 11 deletions(-) diff --git a/docs/api/python/contrib/onnx.md b/docs/api/python/contrib/onnx.md index efad386aa264..2a8f1fcea0a6 100644 --- a/docs/api/python/contrib/onnx.md +++ b/docs/api/python/contrib/onnx.md @@ -2,21 +2,21 @@ ## Overview -[ONNX](https://onnx.ai/) is an open format to represent deep learning models. With ONNX as an intermediate representation, it is easier to move models between state-of-the-art tools and frameworks for training and inference.. +[ONNX](https://onnx.ai/) is an open format to represent deep learning models. With ONNX as an intermediate representation, it is easier to move models between state-of-the-art tools and frameworks for training and inference. -The `mxnet.contrib.onnx` package refers to the APIs and interfaces that implements ONNX model format support for Apache MXNet. +The `mxnet.contrib.onnx` package refers to the APIs and interfaces that implement ONNX model format support for Apache MXNet. -With ONNX format support for MXNet, developers can build and train models with PyTorch, CNTK, or Caffe2, and import these models into MXNet to run them for inference and training using MXNet’s highly optimized engine. 
+With ONNX format support for MXNet, developers can build and train models with a [variety of deep learning frameworks](http://onnx.ai/supported-tools), and import these models into MXNet to run them for inference and training using MXNet’s highly optimized engine. ```eval_rst .. warning:: This package contains experimental APIs and may change in the near future. ``` -```eval_rst -.. note:: To use this module developers need to **install ONNX**, which needs protobuf compiler, to be installed separately. Please follow the instructions to install ONNX - https://github.com/onnx/onnx -``` +### Installation Instructions +- To use this module developers need to **install ONNX**, which requires protobuf compiler to be installed separately. Please follow the [instructions to install ONNX and its dependencies](https://github.com/onnx/onnx#installation). Once installed, you can go through the tutorials on how to use this module. + -This document describes the ONNX APIs in mxnet. +This document describes all the ONNX-MXNet APIs. ```eval_rst .. autosummary:: diff --git a/docs/tutorials/onnx/super_resolution.md b/docs/tutorials/onnx/super_resolution.md index 55fb15e6c53a..dc75b6606f20 100644 --- a/docs/tutorials/onnx/super_resolution.md +++ b/docs/tutorials/onnx/super_resolution.md @@ -5,8 +5,8 @@ In this tutorial we will: - learn how to load a pre-trained ONNX model file into MXNet. - run inference in MXNet. -## Pre-requisite -The code demonstration assumes that the following python packages are installed: +## Prerequisites +This example assumes that the following python packages are installed: - [mxnet](http://mxnet.incubator.apache.org/install/index.html) - [onnx](https://github.com/onnx/onnx) (follow the install guide) - Pillow - A Python Image Processing package and is required for input pre-processing. It can be installed with ```pip install Pillow```. 
@@ -41,7 +41,7 @@ To completely describe a pre-trained model in MXNet, we need two elements: a sym sym, arg, aux = onnx_mxnet.import_model(onnx_model_file) ``` -We can now visualize the imported model( graphviz needs to be installed) +We can now visualize the imported model (graphviz needs to be installed) ```python @@ -78,7 +78,7 @@ mod.bind(for_training=False, data_shapes=[('input_0',test_image.shape)], label_s mod.set_params(arg_params=arg, aux_params=aux, allow_missing=True, allow_extra=True) ``` -Module API's forward method requires Batch of data as input. We will prepare the data in that format and feed it to the forward method. +Module API's forward method requires batch of data as input. We will prepare the data in that format and feed it to the forward method. ```python @@ -102,6 +102,7 @@ result_img = Image.merge( img_cb.resize(img_out_y.size, Image.BICUBIC), img_cr.resize(img_out_y.size, Image.BICUBIC) ]).convert("RGB") +result_img.save("super_res_output.jpg") ``` Here's the input image and the resulting output images compared. As you can see, the model was able to increase the spatial resolution from ``256x256`` to ``672x672``. From 559e17e44eaaa8e51fd93ed47ecff08ba53b4591 Mon Sep 17 00:00:00 2001 From: Acharya Date: Thu, 29 Mar 2018 18:12:59 -0700 Subject: [PATCH 6/6] rerun CI due to unrelated failures.