Conversation

@tqchen (Member) commented Jan 3, 2020

Currently, we use a tvm::Var to represent a placeholder for shapes in generic types.
This is not necessary for GlobalTypeVar (as we never parameterize by shape var),
and is a bit twisted for TypeVar.

As we move to a unified type system, we want to break the dependency
of the base TypeVar (which is shared across the languages) on the expression.
Note that it is fine for TensorType to depend on Expr.

One alternative solution to embedding the Var would be to introduce a TypeVarExpr,
which can wrap a TypeVar as an Expr. However, this alternative won't be
natural until we migrate the types to the global scope.
Given that we do not yet depend heavily on the shape parameterization, we propose to carry out the TypeVar change first.

This PR removes the tvm::Var from the type vars. We will follow up with another
PR to migrate the types to a base location. After that, we should be able to
use the more elegant approach via TypeVarExpr.

@tqchen (Member Author) commented Jan 3, 2020

@zhiics (Member) left a comment

lgtm

@MarisaKirisame (Contributor) commented Jan 4, 2020

Having types that use exprs is called dependent typing. Typically, instead of a TypeVarExpr, people unify the "type", "kind", and "expr" data types into one single type. Why don't we merge type and expr? It would, e.g., make attributes not special anymore; they would become first-class citizens, so one could very easily define a function in tvm that basically calls another function. Right now that is not possible unless you do it in C++, because you have to fill out the TypeRelation and do the dispatching yourself there.
@jroesch

@tqchen (Member Author) commented Jan 4, 2020

@MarisaKirisame I think we could start a separate discussion thread in the forum for the type system design. My previous takeaway with @jroesch was that we don't want to introduce full dependent types.

Currently, the Expr used in TensorType is not part of the relay expression language, and it only supports integer arithmetic to express shape relations.

@tqchen tqchen merged commit 24e6fcb into apache:master Jan 4, 2020
@tqchen tqchen deleted the typevar branch January 6, 2020 19:19
alexwong pushed a commit to alexwong/tvm that referenced this pull request Feb 26, 2020
alexwong pushed a commit to alexwong/tvm that referenced this pull request Feb 28, 2020
zhiics pushed a commit to neo-ai/tvm that referenced this pull request Mar 2, 2020