What Relay expands this to is a memory copy. I want to avoid that and have a
copy-less representation in TIR.
This should really be a no-op, but it ends up copying everything.
```
import tensorflow as tf
import tvm
import tvm.relay
g = tf.Graph()
with g.as_default():
    # placeholder shape chosen for illustration; any static shape works
    u = tf.unstack(tf.placeholder(tf.float32, shape=(4, 8)))
```
TensorArray is supported in Relay and TF TensorArray ops can be converted now.
Did you mean something more than these?
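As an aside, the "should be a no-op" point can be illustrated outside TVM: in NumPy, basic slicing returns views, so unstacking along axis 0 need not copy anything. A minimal sketch (plain NumPy, not TVM or Relay API):

```python
import numpy as np

a = np.arange(12, dtype=np.float32).reshape(4, 3)

# "Unstack" along axis 0: basic slicing yields views, not copies.
slices = [a[i] for i in range(a.shape[0])]

# Each slice shares storage with the original array.
assert all(np.shares_memory(a, s) for s in slices)

# Writing through a view is visible in the original -- no data was copied.
slices[0][0] = 100.0
assert a[0, 0] == 100.0
```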
---
[Visit Topic](https://discuss.tvm.ai/t/tensor-arrays-in-tir/7135/3) to respond.
You are receiving this because you enabled mailing list mode.
It is necessary for many use cases (like AOT), and I believe @tqchen has some
ideas on this too.
---
[Visit Topic](https://discuss.tvm.ai/t/tensor-arrays-in-tir/7135/2) to respond.
Is there any effort to support tensor arrays in TIR? That would be useful for
representing operations like `stack` or `unstack` from TF.
Let's say we want to write an op that concatenates a variable number of
tensors, but without actually copying any data. Instead, it would create a
view that references the original buffers.
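One way to picture the requested representation is as pure bookkeeping: keep the input chunks where they are and translate a logical index into (chunk, offset). This is an illustrative Python sketch with a hypothetical `ChunkedConcat` helper, not any existing TVM/TIR API:

```python
import numpy as np
from bisect import bisect_right

class ChunkedConcat:
    """Logical 1-D concatenation of a variable number of arrays, no copies."""

    def __init__(self, chunks):
        self.chunks = chunks
        # Cumulative end offsets, e.g. chunk sizes [3, 2] -> [3, 5].
        self.offsets = np.cumsum([len(c) for c in chunks]).tolist()

    def __len__(self):
        return self.offsets[-1] if self.offsets else 0

    def __getitem__(self, i):
        # Map the logical index to (chunk id, position within that chunk).
        k = bisect_right(self.offsets, i)
        start = self.offsets[k - 1] if k > 0 else 0
        return self.chunks[k][i - start]

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0])
cat = ChunkedConcat([x, y])
assert len(cat) == 5
assert cat[3] == 4.0  # reads straight from y; no data was moved
```

A TIR-level version would do the same index arithmetic at lowering time, so the "concat" never materializes a contiguous buffer.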