[quote="wrongtest, post:3, topic:7960"]
If I have some common neural network structure such as resnet50 at hand, can I 
just use autodiff to get the backward computation graph?
[/quote]
Graph-wise, I think you can refer to 
[relay.transform.gradient](https://github.com/apache/incubator-tvm/blob/master/python/tvm/relay/transform/transform.py#L713). 
As you lower the differentiated graph, you can leverage the tensor-level 
autodiff 
([te.gradient](https://github.com/apache/incubator-tvm/blob/master/python/tvm/te/autodiff.py#L22)), 
though tensor gradients are currently mostly written by hand.
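As a rough illustration of the graph-level path, something like the sketch below should work; the toy function, shapes, and the `mode="first_order"` choice are all illustrative assumptions, not requirements:

```python
import tvm
from tvm import relay
from tvm.relay.transform import gradient

# A toy function f(x) = x * x; shape and dtype are illustrative.
x = relay.var("x", shape=(3, 4), dtype="float32")
func = relay.Function([x], x * x)

# gradient() expects a type-checked function, so run InferType first.
mod = tvm.IRModule.from_expr(func)
mod = relay.transform.InferType()(mod)

# The pass returns a new function that computes the original output
# together with a tuple of gradients w.r.t. each parameter.
back_func = gradient(mod["main"], mode="first_order")
print(back_func)
```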
[quote="wrongtest, post:3, topic:7960"]
Is there some description of the common ops that can be covered by autodiff?
[/quote]
You may refer to the [test 
cases](https://github.com/apache/incubator-tvm/blob/master/tests/python/unittest/test_te_autodiff.py).
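To give a sense of what those tests exercise, `te.gradient` differentiates a `te.compute` result with respect to its inputs; the op and shapes below are assumptions for demonstration:

```python
import tvm
from tvm import te

# y[i, j] = x[i, j] * x[i, j]; shape is illustrative.
x = te.placeholder((3, 4), name="x", dtype="float32")
y = te.compute((3, 4), lambda i, j: x[i, j] * x[i, j], name="y")

# With the default head, te.gradient returns the full Jacobian of y
# w.r.t. each listed input (shape: y.shape + x.shape).
[dx] = te.gradient(y, [x])

# The adjoint is an ordinary tensor, so it can be scheduled and lowered.
s = te.create_schedule(dx.op)
print(tvm.lower(s, [x, dx], simple_mode=True))
```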

[quote="wrongtest, post:3, topic:7960"]
Can te.scan() be supported?
[/quote]
Currently it is not supported.




