https://tvm.apache.org/docs/tutorials/frontend/deploy_sparse.html?highlight=save_json
Check this tutorial out; you will find answers there.
---
[Visit
Topic](https://discuss.tvm.apache.org/t/importing-relay-irmodule-from-text-file/9297/2)
to respond.
You are receiving this because you enabled mailing list mode.
Hi All,
Is there a way to return intermediate values inside an IRModule as an output?
Here is an example. How can I return %1 or %0 as an output from the IRModule
or from a relay.Function? @mbrookhart, any suggestions?
```
def @main(%x1: Tensor[(1, 1, 1, 20), float32], %y1: Tensor[(1, 1,
```
When you construct a relay Function, you can wrap multiple output variables in
relay.Tuple and return that as the output of the Function.
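The idea can be sketched with a toy single-expression IR (the class names below are hypothetical stand-ins, not the TVM API): since a function's body is one expression, returning an intermediate means replacing the body with a tuple node that holds both the old body and the extra value. In Relay the analogues are `relay.Tuple` and `relay.Function`.

```python
from dataclasses import dataclass

# Toy stand-ins for relay.Expr / relay.Tuple / relay.Function (hypothetical).
@dataclass
class Expr:
    name: str

@dataclass
class TupleExpr:
    fields: list

@dataclass
class Function:
    params: list
    body: object

def return_intermediates(fn, extras):
    """Rebuild the function so its body is a tuple of (old body, *extras)."""
    return Function(fn.params, TupleExpr([fn.body] + list(extras)))

x0 = Expr("%0")              # an intermediate we also want as an output
x1 = Expr("%1")              # the original result
f = Function(["%x1", "%y1"], x1)
g = return_intermediates(f, [x0])
# g.body.fields == [x1, x0]
```

Callers then index into the returned tuple to pick out each output; the original single-output function is unchanged except for the wrapped body.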
---
[Visit
Topic](https://discuss.tvm.apache.org/t/changing-return-of-relay-function-or-irmodule/9307/2)
to respond.
Hi @mbrookhart,
Could you please elaborate on how I wrap multiple outputs for a relay.Function?
This is my attempt to add an additional "return" to a relay function. In
visit_function(), I tried to reconstruct the function, but the relay.Function
signature does not have any place where I can wrap them.
Try this:
```
new_body = self.visit(fn.body)
print("Visited all", new_body)
return_values_to_function = relay.Tuple([new_body] + self.return_values)
func = relay.Function(fn.params, return_values_to_function,
                      fn.ret_type, fn.type_params, fn.attrs)
```
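As a rough stdlib-only analogy of how such a mutator pass fits together (toy classes, hypothetical names, not the TVM API): rebuild calls post-order, remember the rebuilt nodes you want to surface, then wrap them into the function's return value.

```python
# Toy expression tree standing in for Relay (hypothetical names).
class Call:
    def __init__(self, op, args):
        self.op, self.args = op, args

class Function:
    def __init__(self, params, body):
        self.params, self.body = params, body

class ReturnCollector:
    """Post-order mutator: rebuilds calls, remembers every rebuilt 'add'."""
    def __init__(self):
        self.return_values = []

    def visit(self, node):
        if isinstance(node, Call):
            return self.visit_call(node)
        return node  # leaves (variables/constants) pass through unchanged

    def visit_call(self, call):
        post = Call(call.op, [self.visit(a) for a in call.args])
        if post.op == "add":
            self.return_values.append(post)  # store the post-mutation node
        return post

    def visit_function(self, fn):
        new_body = self.visit(fn.body)
        # Return the original result plus the collected intermediates.
        return Function(fn.params, ("tuple", [new_body] + self.return_values))

# mul(add(x, y), z): we want the inner add surfaced as an extra output.
expr = Call("mul", [Call("add", ["x", "y"]), "z"])
fn = Function(["x", "y", "z"], expr)
new_fn = ReturnCollector().visit_function(fn)
```

In TVM itself, `relay.ExprMutator` plays the role of this post-order rewriter, with `relay.Tuple` as the tuple node.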
---
Thanks, @mbrookhart. I think that is promising. However, now my "new_body"
becomes None, which (unsurprisingly) causes an error. This is caused by my
overridden visit_call() function, as follows:
I was hoping to save the "result" of the add operator to self.return_values
with the following code:
Make your visit_call() return this: `super().visit_call(call)`
---
[Visit
Topic](https://discuss.tvm.apache.org/t/changing-return-of-relay-function-or-irmodule/9307/6)
to respond.
Ohh, I see, my bad! Thanks a lot. There is only a slight problem.
This is what I get: it adds an additional statement into the Relay IR
(`%3 = add(%x1, %y1);`) and returns the result.
```
fn (%x1: Tensor[(1, 1, 1, 20), float32], %y1: Tensor[(1, 1, 1, 20), float32]) {
  %0 = add(%x1, %y1);
  %
```
[quote="jmatai1, post:5, topic:9307"]
```
if call.op.name == "add":
# save this call for return!
self.return_values.append(call)
super().visit_call(call)
```
[/quote]
:point_down:
```
post = super().visit_call(call)
if post.op.name == "add":
    # save the post-mutation call for return
    self.return_values.append(post)
return post
```
Awesome! Thanks, and I believe it works.
[quote="mbrookhart, post:8, topic:9307"]
You were storing the add pre-mutation.
[/quote]
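The distinction that bit here can be sketched with a toy rewriter (stdlib only, hypothetical names): a mutator hands visit_call the *old* node; the rebuilt node only exists after `super().visit_call(call)` (here, the recursive rebuild) returns, so that is the one to store.

```python
class Call:
    def __init__(self, op, args):
        self.op, self.args = op, args

class UpperCaser:
    """Toy mutator that rewrites every op name to upper case."""
    def __init__(self):
        self.pre, self.post = [], []

    def visit_call(self, call):
        self.pre.append(call)             # pre-mutation: the stale input node
        rebuilt = Call(call.op.upper(),
                       [self.visit_call(a) if isinstance(a, Call) else a
                        for a in call.args])
        self.post.append(rebuilt)         # post-mutation: the rewritten node
        return rebuilt

m = UpperCaser()
m.visit_call(Call("add", ["x", "y"]))
# m.pre[0].op == "add"; m.post[0].op == "ADD"
```

Storing the pre-mutation node means any later rewrite of its arguments is silently lost, which is exactly why the extra `%3 = add(%x1, %y1);` appeared above.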
---
[Visit
Topic](https://discuss.tvm.apache.org/t/changing-return-of-relay-function-or-irmodule/9307/9)
to respond.
I posted this as an issue on GitHub before, and am reopening it here per
@tqchen's suggestion.
>I'm using TVM as a front end, and need to know the layout of each operator in
>the implementation. Operators like conv2d have "data_layout" and
>"kernel_layout" in their attrs field, which I could use directly,
Compiling deep learning models such as PyTorch models gives poor inference
time with TVM compared to non-TVM approaches. Is there any particular reason?
---
[Visit
Topic](https://discuss.tvm.apache.org/t/pytorch-vs-tensorflow-tvm/9313/1) to
respond.