I am wondering whether there is any chance of introducing a quick way to be compatible with dynamic shapes?
As @cloudhan mentioned, TensorRT lets the user set only the necessary input dimensions at runtime and automatically computes the other tensors' shapes: [Working With Dynamic Shapes](https://docs.nvidia.com/deeplearning/tensorrt/developer-guide/index.html#work_dynamic_shapes)
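Roughly, the TensorRT workflow looks like the sketch below (API details vary by TensorRT version, and the input name `"input"` and the shapes are just placeholders):

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
# ... parse a model whose input has a dynamic (-1) batch dimension ...

# build time: only declare a min/opt/max range for the dynamic input
config = builder.create_builder_config()
profile = builder.create_optimization_profile()
profile.set_shape("input",            # placeholder input name
                  (1, 3, 224, 224),   # min
                  (8, 3, 224, 224),   # opt
                  (32, 3, 224, 224))  # max
config.add_optimization_profile(profile)
engine = builder.build_engine(network, config)

# run time: set only the actual input shape; TensorRT derives every
# intermediate tensor's shape from it
context = engine.create_execution_context()
context.set_binding_shape(0, (4, 3, 224, 224))
```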
I also noticed that Relay can parse models with dynamic shapes, but they fail at relay.build() or vm.compile().
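For example, a toy module like the one below (just a sketch; the exact error depends on the model and the TVM version) already runs into this:

```python
import tvm
from tvm import relay

# toy Relay function whose batch dimension is dynamic (relay.Any())
x = relay.var("x", shape=(relay.Any(), 3, 224, 224), dtype="float32")
mod = tvm.IRModule.from_expr(relay.Function([x], relay.nn.relu(x)))

# the graph-executor path does not accept dynamic shapes:
# relay.build(mod, target="llvm")
# and for many real models the VM path fails here as well:
# relay.vm.compile(mod, target="llvm")
```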
So, could we have a feature like this:
```python
mod, params = relay.frontend.from_tensorflow(...)
mod.get_tensor_by_name('input:0').set_shapes((...))
mod.auto_compute_shapes()
...
relay.build(mod, target)
```
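As far as I understand, the closest thing today is to pass a concrete shape dict at import time and let the frontend re-infer the downstream shapes, roughly as in the sketch below (the input name and shape are placeholders), but changing the shape then means importing and building again:

```python
from tvm import relay

# graph_def: the TensorFlow GraphDef to import; "input" and the
# shape below stand in for the real input name and dimensions
mod, params = relay.frontend.from_tensorflow(
    graph_def, shape={"input": (1, 224, 224, 3)})

# all downstream tensor shapes are inferred statically from this,
# but a different batch size needs another import + build
lib = relay.build(mod, target="llvm", params=params)
```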
Achieving full support and optimization for dynamic shapes seems to be a huge project, and I have seen a lot of users express interest in and concern about this topic. I think even a small further step that can be done quickly would be helpful.
@kevinthesun, I'm a newbie in TVM, so maybe what I have in mind is naively simple. I just want to help, and I'd appreciate any feedback~