@ajtulloch @icemelon9 and I have been quietly hacking on a prototype for the
last few months but have been busy with other things (such as the VM 😄). We
are going to start pushing now. I opened a draft PR that contains the
type-checking changes, and we will follow up with code generation next.
This looks like a great design, and should handle all the applications I'm
currently thinking of. @jroesch let us know if we can help.
--
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/dmlc/tvm/issues/3042
I think for now it is okay to just support codegen with fully symbolic shapes.
Later, if we want specific optimization strategies for different programs, we
can revisit this and add more codegen strategies.
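To make the "fully symbolic shape" idea concrete, here is a toy sketch (plain
Python/NumPy, not TVM's actual API or lowering): a kernel built once against a
symbolic extent `n` reads its dimensions at run time instead of baking them in,
so a single compiled artifact serves inputs of every length.

```python
import numpy as np

def make_symbolic_add():
    # 'n' is symbolic: the kernel queries it from the input at call time,
    # so nothing about the shape is fixed at "compile" time.
    def kernel(a, b):
        n = a.shape[0]                   # runtime lookup of the symbolic dim
        out = np.empty(n, dtype=a.dtype)
        for i in range(n):
            out[i] = a[i] + b[i]
        return out
    return kernel

add = make_symbolic_add()                # built once...
print(add(np.ones(3), np.ones(3)))      # ...runs on length 3
print(add(np.ones(7), np.ones(7)))      # ...and on length 7, same kernel
```

Shape-specialized codegen strategies (the "revisit later" case) would instead
stamp out one kernel per concrete shape, trading code size for tighter loops.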
Another, less mature idea is to unify composite types with operators'
attributes (but that is perhaps a much longer-term goal).
This is great progress toward deploying real-world NLP models. Thanks, Jared,
for proposing this!
As far as I can tell, however, in the long term we will probably need support
for more general dependent typing across more data types. I am afraid that we
will have to consider composite types in
# Supporting Dynamic Dimensions
I recently opened an RFC proposing a new dynamic runtime (see #2810).
A missing piece of the puzzle for supporting fully dynamic models is typing
and code generation for tensors with statically unknown shapes.
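The typing piece can be sketched in a few lines. The names below are
hypothetical (this is not Relay's implementation): a sentinel stands in for a
statically unknown dimension, it unifies with any concrete extent, and
concrete extents must match exactly.

```python
# Hypothetical sketch of shape unification with an unknown dimension;
# ANY plays the role of a dimension whose extent is not known statically.
ANY = object()

def unify_dim(a, b):
    # An unknown dim picks up whatever the other side knows.
    if a is ANY:
        return b
    if b is ANY:
        return a
    if a == b:
        return a
    raise TypeError(f"dimension mismatch: {a} vs {b}")

def unify_shape(s1, s2):
    # Ranks must agree; dims are unified pointwise.
    if len(s1) != len(s2):
        raise TypeError("rank mismatch")
    return tuple(unify_dim(a, b) for a, b in zip(s1, s2))

# (ANY, 4) unifies with (32, 4): the unknown dim becomes concrete.
print(unify_shape((ANY, 4), (32, 4)))  # -> (32, 4)
```

Code generation is the harder half: a kernel typed against such a shape cannot
assume any fixed extent, which is why the symbolic-shape codegen discussed
above matters.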
There are three critical steps to supporting dynami