Conversation

@tqchen tqchen commented Jul 19, 2017

No description provided.

@tqchen tqchen merged commit 4f4cd5c into apache:master Jul 19, 2017
@tqchen tqchen deleted the ios branch July 19, 2017 18:47
vinx13 pushed a commit to vinx13/tvm that referenced this pull request Mar 9, 2022
gigiblender pushed a commit to gigiblender/tvm that referenced this pull request Jan 19, 2023
Previously, when a Relay function contained a Call that directly used a Tuple as an argument (as in the example below),
```
%25 = (%23, %24) /* ty=(Tensor[(1, 160), float32], Tensor[(1, 160), float32]) */;
%26 = concatenate(%25, axis=-1) /* ty=Tensor[(1, 320), float32] */;
```
our Relay translator was unable to generate the corresponding CallTIR, because the translator always assumed that each argument of a Call mapped to a single tensor (see the code snippets below: the translator directly passes the Relax variable `new_args[-1]` to the function `te_tensors`, which translates a Var into a single tensor).
https://github.com/tlc-pack/relax/blob/60e9a01cdfdd013945790fc03d5abad29b8a7c0b/python/tvm/relax/testing/relay_translator.py#L124
https://github.com/tlc-pack/relax/blob/60e9a01cdfdd013945790fc03d5abad29b8a7c0b/src/relax/ir/emit_te.h#L56-L61

In fact, the Relax variable may correspond to a Tuple of tensors, a case that was not taken into consideration before and that can lead to an error in `TETensor` when creating tensors.

Therefore, this PR fixes the issue by examining the Relax variable before creating tensors for Relay Call arguments. If an argument has a Tuple shape and TupleType, we break the tuple variable down, emit a TupleGetItem for each field, and create a tensor for each field.
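The per-field expansion described above can be sketched in plain Python. This is a hypothetical illustration, not the actual translator code: plain Python tuples stand in for Relax `TupleType`, and `(var, field_index)` pairs stand in for the emitted `TupleGetItem` nodes.

```python
# Hypothetical sketch of the tuple-argument expansion described above.
# Plain Python tuples stand in for Relax TupleType; (var, index) pairs
# stand in for emitted TupleGetItem nodes. Not the real TVM API.

def expand_call_args(args, arg_types):
    """Expand each Call argument into one entry per underlying tensor.

    A tuple-typed argument contributes one (var, field_index) entry per
    field, mirroring the TupleGetItem emitted for each field; a plain
    tensor argument contributes a single (var, None) entry.
    """
    expanded = []
    for var, ty in zip(args, arg_types):
        if isinstance(ty, tuple):  # stands in for a TupleType check
            for i in range(len(ty)):
                expanded.append((var, i))  # emit TupleGetItem(var, i)
        else:
            expanded.append((var, None))   # single tensor, use var directly
    return expanded


# The concatenate example above: %25 is a two-field tuple argument,
# so it expands into one entry per field.
entries = expand_call_args(
    ["%25"], [("Tensor[(1, 160)]", "Tensor[(1, 160)]")]
)
# entries == [("%25", 0), ("%25", 1)]
```

With this expansion in place, the downstream tensor-creation step only ever sees single-tensor entries, which is the invariant the original translator assumed.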
gigiblender pushed a commit to gigiblender/tvm that referenced this pull request Jan 19, 2023
…pache#316)

This PR removes the `global_symbol` linkage added by Relay Translator. It also fixes unaddressed comments of apache#262.

All tests pass locally, and I believe it is safe to merge this PR directly.
junrushao pushed a commit to junrushao/tvm that referenced this pull request Feb 8, 2023
junrushao pushed a commit to junrushao/tvm that referenced this pull request Feb 8, 2023
yelite pushed a commit to yelite/tvm that referenced this pull request Feb 17, 2023
yelite pushed a commit to yelite/tvm that referenced this pull request Feb 17, 2023
MasterJH5574 pushed a commit to MasterJH5574/tvm that referenced this pull request Aug 17, 2025
junrushao added a commit to junrushao/tvm that referenced this pull request Nov 14, 2025
Upstream : https://github.com/apache/tvm-ffi.git
Branch   : main
New HEAD : ae346ec92a3c386f1376064ae086aae72947c329
Subject  : [DTYPE] Align bool parsing to align with DLPack (apache#262)
Author   : Tianqi Chen <tqchen@users.noreply.github.com>
Date     : 2025-11-14T18:40:40-05:00
Delta    : 1 commit(s) since 7f3f8726156a
Compare  : apache/tvm-ffi@7f3f872...ae346ec

This commit updates the tvm-ffi submodule to the latest upstream HEAD.
