34 commits
487cee0
graph partitioning
zhiics Aug 13, 2019
e0877e8
extern op coloring infra
zhiics Aug 23, 2019
1b2041d
eliminate redundant subgraph annotation
zhiics Aug 25, 2019
f7e299b
A failing example
zhiics Aug 29, 2019
60004d1
Refine partition algorithm
comaniac Aug 30, 2019
fe3c486
Support multiple subgraphs (runtime not work)
comaniac Aug 31, 2019
8734d0a
Support multiple function body nodes
comaniac Aug 31, 2019
590996e
Add a hack for multiple subgraphs
zhiics Sep 2, 2019
5a6f955
Add rest node visiting to propagate subgraphs.
comaniac Sep 3, 2019
2ffc785
cblas template
zhiics Sep 6, 2019
c8e1499
make Cblas working and refactor contrib codegen
comaniac Sep 10, 2019
7eef6f1
small fix for style and check handle_ before closing it
zhiics Sep 10, 2019
001f9c6
refactor the interface for different data types
comaniac Sep 11, 2019
7febcdb
add MKLDNN support and refine interface
comaniac Sep 18, 2019
47517af
Simplify runtime invoke
comaniac Sep 20, 2019
66779ec
to vm: add an InvokeExternal Instruction
zhiics Sep 23, 2019
a7380a1
refactor backend interface and remove cblas
comaniac Sep 25, 2019
d86a5f7
To vm: enable multiple function compilation
zhiics Sep 30, 2019
f4c55a5
enable vm test for subgraph with multiple nodes
zhiics Oct 2, 2019
5f8ecaf
fix lint
zhiics Oct 9, 2019
c7c74c3
remove get lib path API
comaniac Oct 9, 2019
5d5c907
initial commit tutorial
comaniac Oct 17, 2019
0dcc5d0
add annotation to tutorial
comaniac Oct 18, 2019
70d1e33
Refine tutorial a bit
zhiics Oct 28, 2019
125b28f
rebase to upstream
zhiics Oct 29, 2019
a298f9c
Improve:
zhiics Oct 30, 2019
7891208
change macro to function to reduce .so size
comaniac Oct 31, 2019
b8f1db0
create attr const string for Function
zhiics Nov 1, 2019
2736619
rebase to upstream
zhiics Nov 1, 2019
d950234
return csourcemodule from external codegen
zhiics Nov 25, 2019
6fb0605
fix test and clean code
comaniac Nov 27, 2019
7beb0e3
fix ci
comaniac Nov 27, 2019
5ff7fa6
more cleanup
zhiics Nov 28, 2019
cf7ac3c
fix typo
comaniac Dec 1, 2019
1 change: 1 addition & 0 deletions CMakeLists.txt
@@ -252,6 +252,7 @@ include(cmake/modules/LLVM.cmake)
include(cmake/modules/Micro.cmake)
include(cmake/modules/ANTLR.cmake)
include(cmake/modules/contrib/BLAS.cmake)
include(cmake/modules/contrib/Extern.cmake)
include(cmake/modules/contrib/Random.cmake)
include(cmake/modules/contrib/MicroStandaloneRuntime.cmake)
include(cmake/modules/contrib/Sort.cmake)
5 changes: 5 additions & 0 deletions cmake/config.cmake
@@ -163,6 +163,11 @@ set(USE_ROCBLAS OFF)
# Whether use contrib sort
set(USE_SORT ON)

# Whether use contrib extern (use ";" to separate multiple externs)
# Available externs:
# dnnl
set(USE_EXTERN none)
Contributor
Given the number of external compilers and runtimes we may have, I think it's better for each one to have its own field in config.cmake. For example, USE_GCC or USE_DNNL. With this, we can be more flexible with our options for finding the compiler / runtime. For example, USE_GCC=ON vs USE_GCC=<path to gcc>

I think they should be prefixed as USE_EXTERNAL_ to make it clear this is part of the external integration if we go down that path.


# Build ANTLR parser for Relay text format
# Possible values:
# - ON: enable ANTLR by searching default locations (cmake find_program for antlr4 and /usr/local for jar)
34 changes: 34 additions & 0 deletions cmake/modules/contrib/Extern.cmake
@@ -0,0 +1,34 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

message(STATUS "Build with relay.backend.contrib")

file(GLOB GCC_RELAY_CONTRIB_SRC src/relay/backend/contrib/gcc/codegen.cc)
list(APPEND COMPILER_SRCS ${GCC_RELAY_CONTRIB_SRC})

list(FIND USE_EXTERN "dnnl" DNNL_IDX)
if(DNNL_IDX GREATER -1)
file(GLOB DNNL_RELAY_CONTRIB_SRC src/relay/backend/contrib/dnnl/codegen.cc)
list(APPEND COMPILER_SRCS ${DNNL_RELAY_CONTRIB_SRC})

find_library(EXTERN_LIBRARY_DNNL dnnl)
list(APPEND TVM_RUNTIME_LINKER_LIBS ${EXTERN_LIBRARY_DNNL})
file(GLOB DNNL_CONTRIB_SRC src/runtime/contrib/dnnl/*)
list(APPEND RUNTIME_SRCS ${DNNL_CONTRIB_SRC})
message(STATUS "Use extern library: MKLDNN" ${EXTERN_LIBRARY_DNNL})
endif()

13 changes: 13 additions & 0 deletions include/tvm/relay/attrs/annotation.h
@@ -57,6 +57,19 @@ struct CastHintAttrs : public tvm::AttrsNode<CastHintAttrs> {
}
};

/*!
* \brief Options for the subgraph operators.
*/
struct SubgraphAttrs : public tvm::AttrsNode<SubgraphAttrs> {
/*! \brief The 3rd party compiler for subgraph code generation. */
std::string compiler;

TVM_DECLARE_ATTRS(SubgraphAttrs, "relay.attrs.SubgraphAttrs") {
TVM_ATTR_FIELD(compiler)
.describe("The 3rd party compiler used for subgraph code generation.");
}
};

} // namespace relay
} // namespace tvm
#endif // TVM_RELAY_ATTRS_ANNOTATION_H_
27 changes: 27 additions & 0 deletions include/tvm/relay/expr.h
@@ -268,6 +268,14 @@ class FunctionNode : public ExprNode {
*/
bool IsPrimitive() const;

/*!
* \brief Check whether the function is an external function.
* External functions are subgraphs that are supported by external libraries.
*
* \return Whether the function is external or not.
*/
bool IsExternal() const;

TVM_DLL static Function make(tvm::Array<Var> params,
Expr body,
Type ret_type,
@@ -588,6 +596,25 @@ std::string AsText(const NodeRef& node,
bool show_meta_data = true,
runtime::TypedPackedFunc<std::string(Expr)> annotate = nullptr);

/*! \brief namespace of the attributes that are attached to a function. */
namespace attr {
/*! \brief Mark the function as a primitive function. */

What do we mean by a "primitive" function?

constexpr const char* kPrimitive = "Primitive";
/*!
* \brief Mark the function as an external function that needs to be handled by
* the external codegen tool/backend.
*/
constexpr const char* kExternal = "External";
/*! \brief Indicate if the function is a closure. */
constexpr const char* kClosure = "Closure";
/*! \brief Store a Var to parameter/Constant mapping on a Function. */
constexpr const char* kParams = "__params__";
/*! \brief Store the function name. */
constexpr const char* kFuncName = "FuncName";
/*! \brief Mark if the function should not be optimized. */
constexpr const char* kSkipOptimization = "SkipOptimization";
} // namespace attr

} // namespace relay
} // namespace tvm
#endif // TVM_RELAY_EXPR_H_
23 changes: 20 additions & 3 deletions include/tvm/relay/op_attr_types.h
@@ -29,6 +29,7 @@
#include <tvm/build_module.h>
#include <tvm/relay/type.h>
#include <tvm/relay/expr.h>
#include <string>

namespace tvm {
namespace relay {
@@ -122,7 +123,7 @@ using FTVMSchedule = runtime::TypedPackedFunc<
* operator with other expressions. This function will be invoked
* in AlterOpLayout pass.
* \param attrs The attribute of the original node.
* \param inputs The input symbols of the original node.
* \param args The input symbols of the original node.
* \param tinfos An array of placeholders, use for getting the inferred shape
* and dtype of the inputs.
* \return new_expr The modified expression.
@@ -136,8 +137,8 @@ using FTVMAlterOpLayout = runtime::TypedPackedFunc<
* \brief Legalizes an expression with another expression. This function will be
* invoked in Legalize pass. It is a target-dependent pass.
* \param attrs The attribute of the original node.
* \param inputs The input symbols of the original node.
* \param tinfos An array of placeholders, use for getting the inferred shape
* \param args The input symbols of the original node.
* \param arg_types An array of placeholders, use for getting the inferred shape
* and dtype of the inputs.
* \return new_expr The modified expression.
*/
@@ -146,6 +147,22 @@ using FTVMLegalize = runtime::TypedPackedFunc<
const Array<Expr>& args,
const Array<tvm::relay::Type>& arg_types)>;

/*!
* \brief Annotates an expression to indicate which external codegen tool an op
* should be scheduled to. It is a hardware dependent pass.
*
* \param attrs The attribute of the original expr.
* \param args The arguments of the original expr.
* \param compiler The external compiler that is used for external ops.
*
* \return true if this op should be registered with external codegen tool,
* otherwise, false.
*/
using FTVMExternOp = runtime::TypedPackedFunc<
bool(const Attrs& attrs, // NOLINT(*)
const Array<Expr>& args,
const std::string& compiler)>;

/*!
* \brief Forward rewriting rule for a specific op.
*
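For illustration, a hedged sketch of how an operator might be hooked into this attribute from Python. It assumes the register_extern_op helper (imported in python/tvm/relay/op/__init__.py below) follows the same decorator pattern as the other register_* helpers and passes along the (attrs, args, compiler) arguments of FTVMExternOp; the predicate body is illustrative only.

```python
from tvm.relay import op as reg

# Sketch only: the decorator name and callback signature are assumed to
# mirror FTVMExternOp, i.e. (attrs, args, compiler) -> bool.
@reg.register_extern_op("nn.conv2d")
def conv2d_extern(attrs, args, compiler):
    # Offload conv2d to the "dnnl" backend only for ungrouped convolutions.
    return compiler == "dnnl" and int(attrs.groups) == 1
```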
8 changes: 8 additions & 0 deletions include/tvm/relay/transform.h
@@ -576,6 +576,14 @@ TVM_DLL Pass EtaExpand(bool expand_constructor, bool expand_global_var);
*/
TVM_DLL Pass PrintIR(bool show_meta_data = true);

/*!
* \brief Partition a Relay program into regions that can be executed on
* different backends.
*
* \return The pass.
*/
TVM_DLL Pass PartitionGraph();
Member
We should move away from graph terminology and start to emphasize that we have more than a data-flow graph; the graph framing has led people to overlook scoping, effects, etc.


} // namespace transform

/*!
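As a quick illustration of the Python entry point this declaration gets (relay.transform.PartitionGraph, also used by build_extern later in this PR), a minimal sketch; on a module without subgraph annotations the pass should return the module essentially unchanged.

```python
from tvm import relay

# Minimal sketch: run the new pass directly on a tiny module. Without
# subgraph_begin/subgraph_end annotations there is nothing to partition.
x = relay.var("x", shape=(4,), dtype="float32")
mod = relay.Module.from_expr(relay.Function([x], relay.abs(x)))
mod = relay.transform.PartitionGraph()(mod)
print(mod)
```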
57 changes: 57 additions & 0 deletions include/tvm/runtime/contrib/dnnl/dnnl_kernel.h
@@ -0,0 +1,57 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/

/*!
* \file include/tvm/runtime/contrib/dnnl/dnnl_kernel.h
* \brief Use external dnnl library kernels.
*/

#ifndef TVM_RUNTIME_CONTRIB_DNNL_DNNL_KERNEL_H_
#define TVM_RUNTIME_CONTRIB_DNNL_DNNL_KERNEL_H_

#include "dnnl.hpp"

namespace tvm {
namespace runtime {
Member
This header should be part of the internal headers (moved to src/), as we don't want to expose it to users of TVM.

Member Author
Yeah, I intentionally moved it from src/ to the headers. The reason is that the wrapper can then directly include it, and export_library will take care of finding the path under include/tvm here:

https://github.com/apache/incubator-tvm/blob/279a8ebae6d507f02d904397672dc44982719645/python/tvm/_ffi/libinfo.py#L183

Otherwise, we would have to ask users to pass -I$PATH_TO_TVM/src/runtime/contrib

namespace contrib {

using namespace dnnl;

extern "C" void dnnl_conv2d(float* data, float* weights, float* out, int p_N_,
int p_C_, int p_H_, int p_W_, int p_O_, int p_G_,
int p_Ph_, int p_Pw_, int p_Kh_, int p_Kw_,
int p_Sh_, int p_Sw_);

extern "C" void dnnl_dense(float* data, float* weight, float* out, int p_B_,
int p_I_, int p_O_);

extern "C" void dnnl_relu(float* data, float* out, int p_N_, int p_C_, int p_H_,
int p_W_);

extern "C" void dnnl_bn(float* data, float* gamma, float* beta, float* mean,
float* variance, float* out, int p_n_, int p_c_,
int p_h_, int p_w_, int p_e_);

extern "C" void dnnl_add(float* data, float* weight, float* out, int p_n_,
int p_c_, int p_h_, int p_w_);

} // namespace contrib
} // namespace runtime
} // namespace tvm
#endif // TVM_RUNTIME_CONTRIB_DNNL_DNNL_KERNEL_H_
11 changes: 10 additions & 1 deletion python/tvm/module.py
@@ -133,7 +133,16 @@ def export_library(self,
self.save(path_obj)
files = [path_obj]
is_system_lib = self.type_key == "llvm" and self.get_function("__tvm_is_system_module")()
has_imported_c_file = False
if self.imported_modules:
for i, m in enumerate(self.imported_modules):
if m.type_key == "c":
has_imported_c_file = True
c_file_name = "tmp_" + str(i) + ".cc"
path_cc = temp.relpath(c_file_name)
with open(path_cc, "w") as f:
f.write(m.get_source())
files.append(path_cc)
path_cc = temp.relpath("devc.cc")
with open(path_cc, "w") as f:
f.write(_PackImportsToC(self, is_system_lib))
@@ -143,7 +152,7 @@
fcompile = _tar.tar
else:
fcompile = _cc.create_shared
if self.type_key == "c":
if self.type_key == "c" or has_imported_c_file:
options = []
if "options" in kwargs:
opts = kwargs["options"]
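To sketch the code path this change exercises: when a module imports a "c" source module (the CSourceModule produced by the external codegen), export_library now writes it out as tmp_<i>.cc and compiles it together with the host library. A hedged, illustrative flow follows; the trivial network below will not actually produce an imported c module unless external codegen is involved, and the paths are placeholders.

```python
import tvm
from tvm import relay

# Illustrative only: build a trivial module and export it. With external
# codegen enabled, lib.imported_modules would contain a "c" module whose
# source gets dumped and compiled along with the shared library.
x = relay.var("x", shape=(2, 2), dtype="float32")
func = relay.Function([x], relay.abs(x))
graph, lib, params = relay.build(relay.Module.from_expr(func), target="llvm")
lib.export_library("/tmp/deploy.so", options=["-std=c++11", "-O2"])
loaded = tvm.module.load("/tmp/deploy.so")
```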
2 changes: 1 addition & 1 deletion python/tvm/relay/__init__.py
@@ -29,7 +29,7 @@
from . import adt
from . import analysis
from . import transform
from .build_module import build, create_executor, optimize
from .build_module import build, create_executor, optimize, build_extern
from .transform import build_config
from . import prelude
from . import parser
28 changes: 28 additions & 0 deletions python/tvm/relay/build_module.py
@@ -30,6 +30,7 @@
from .module import Module as _Module
from .backend import interpreter as _interpreter
from .backend.vm import VMExecutor
from . import transform as _transform

def _update_target(target):
target = target if target else _target.current_target()
@@ -296,6 +297,33 @@ def optimize(mod, target=None, params=None):
return mod, params


def build_extern(mod, target):
"""Helper function that builds a Relay function to run on external codegen
tools.

Parameters
----------
mod : relay.Module
The module to build. Using relay.Function is deprecated.

target : str
The name of the external compilation target.

Returns
-------
mod : relay.Module
The Relay module that contains partitioned subgraphs for external codegen
tools.
"""
if isinstance(mod, _expr.Function):
mod = _Module.from_expr(mod)

seq = _transform.Sequential([_transform.ExternOp(target),
_transform.PartitionGraph()])
mod = seq(mod)
return mod


class GraphExecutor(_interpreter.Executor):
"""Wrapper around Executor interface.

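A minimal usage sketch of the build_extern helper above, assuming DNNL was enabled at build time (USE_EXTERN includes "dnnl") and that conv2d/relu are registered as extern ops for that compiler; the network is illustrative.

```python
from tvm import relay

# Illustrative network; which ops actually get partitioned depends on the
# FTVMExternOp registrations for the "dnnl" compiler.
data = relay.var("data", shape=(1, 3, 224, 224), dtype="float32")
weight = relay.var("weight", shape=(16, 3, 3, 3), dtype="float32")
conv = relay.nn.conv2d(data, weight, kernel_size=(3, 3), padding=(1, 1), channels=16)
net = relay.Function([data, weight], relay.nn.relu(conv))

# build_extern runs the ExternOp("dnnl") annotation pass followed by
# PartitionGraph, wrapping supported ops into external subgraph functions.
mod = relay.build_extern(net, "dnnl")
print(mod)
```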
2 changes: 1 addition & 1 deletion python/tvm/relay/op/__init__.py
@@ -19,7 +19,7 @@
# operator defs
from .op import get, register, register_schedule, register_compute, register_gradient, \
register_pattern, register_alter_op_layout, register_legalize, \
schedule_injective, Op, OpPattern, debug
register_extern_op, schedule_injective, Op, OpPattern, debug

# Operators
from .reduce import *
40 changes: 40 additions & 0 deletions python/tvm/relay/op/annotation/annotation.py
@@ -62,6 +62,7 @@ def stop_fusion(data):
"""
return _make.stop_fusion(data)


def checkpoint(data):
"""Annotate an expression to be a checkpoint for the checkpointing memory optimization.

@@ -78,3 +79,42 @@ def checkpoint(data):
return _make.checkpoint(data)

register_schedule("annotation.checkpoint", schedule_injective)


def subgraph_begin(data, compiler):
"""Annotate an expression to indicate that it is the beginning of
a subgraph.

Parameters
----------
data : tvm.relay.Expr
The expression to be annotated.

compiler : str
The compiler used to generate code of a subgraph.

Returns
-------
result : tvm.relay.Expr
The annotated expression.
"""
return _make.subgraph_begin(data, compiler)


def subgraph_end(data, compiler):
"""Annotate an expression to indicate that it is the end of a subgraph.

Parameters
----------
data : tvm.relay.Expr
The expression to be annotated.

compiler : str
The compiler used to generate code of a subgraph.

Returns
-------
result : tvm.relay.Expr
The annotated expression.
"""
return _make.subgraph_end(data, compiler)
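
For completeness, a hedged sketch of manual annotation with these two ops (normally the ExternOp pass inserts them automatically); the "dnnl" compiler name is only an example.

```python
from tvm import relay

# Everything between subgraph_begin and subgraph_end that names the same
# compiler is grouped into one external subgraph by PartitionGraph.
x = relay.var("x", shape=(10, 10), dtype="float32")
begin = relay.annotation.subgraph_begin(x, "dnnl")
out = relay.abs(begin)
end = relay.annotation.subgraph_end(out, "dnnl")
func = relay.Function([x], end)
mod = relay.transform.PartitionGraph()(relay.Module.from_expr(func))
print(mod)
```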
1 change: 1 addition & 0 deletions python/tvm/relay/op/contrib/__init__.py
@@ -18,4 +18,5 @@
"""Neural network related operators."""
from __future__ import absolute_import as _abs
from .contrib import *
from .extern_op import *
from . import _contrib