The reference integer implementation of TFLite conv2d has optional bias
parameters, which are pointers to 32-bit signed values; a null pointer can be
used to skip the bias add.
Is it the intention to always separate out the bias add operation in the TVM APIs?
one other comment ... I've seen a
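As an illustrative aside (not part of the original message): in Relay the bias add is already expressed as a separate op after the convolution, so a graph without bias simply omits it. A minimal sketch, assuming int8 NCHW data and an int32 accumulator:
~~~
from tvm import relay

# Sketch only: conv2d and bias_add are separate Relay ops, so the
# "null bias" case is handled by not emitting nn.bias_add at all.
data = relay.var("data", shape=(1, 3, 224, 224), dtype="int8")
weight = relay.var("weight", shape=(16, 3, 3, 3), dtype="int8")
bias = relay.var("bias", shape=(16,), dtype="int32")

conv = relay.nn.conv2d(data, weight, kernel_size=(3, 3), channels=16,
                       out_dtype="int32")
out = relay.nn.bias_add(conv, bias)  # omitted entirely when there is no bias
~~~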
@anijain2305 Could we also list the APIs of TFLite and QNNPACK? I think both
should be considered, because we will parse TFLite models and QNNPACK is a good
accelerated quantization library.
--
@mshawcroft @weberlo @tmoreau89 please help review if you have time, and
approve or request changes explicitly as described in
https://docs.tvm.ai/contribute/code_review.html#approve-and-request-changes-explicitly
--
@tqchen does this look good to you?
--
### QNN Conv2D operator
TensorFlow
~~~
tf.nn.quantized_conv2d(
    input,
    filter,
    min_input,
    max_input,
    min_filter,
    max_filter,
    strides,
    padding,
    out_type=tf.dtypes.qint32,
    dilations=[1, 1, 1, 1],
    name=None
)
~~~
MXNet
~~~
mxnet.symbol.contrib.qua
@jnorwood Thanks for the comment. Both good points. I will keep those
abilities outside the scope of the requantize op, though.
Another function (not necessarily a Relay operator) can take the min/max and a
config (e.g., nudge so that zero is exactly representable) and generate the
scale and zero point as per
There are a couple of things in this gemmlowp quantization example that you
should perhaps consider supporting in your requantize API:
1. The ability to specify that the range should be extended to include the zero value.
2. The ability to specify that the range should be nudged so that zero is exactly representable.
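As an illustrative aside (not part of the thread): a minimal sketch of such a range-to-quantization-parameters helper in the gemmlowp/TFLite style, covering both points above. The helper name and the uint8 defaults are made up for illustration:
~~~
# Hypothetical helper, not a TVM API: derive (scale, zero_point) from a
# float range, gemmlowp/TFLite style.
def choose_quant_params(rmin, rmax, qmin=0, qmax=255):
    # 1. Extend the range so the real value 0.0 is always representable.
    rmin = min(rmin, 0.0)
    rmax = max(rmax, 0.0)
    scale = (rmax - rmin) / (qmax - qmin) if rmax > rmin else 1.0
    # 2. Nudge the zero point to an integer in [qmin, qmax] so that 0.0
    #    maps exactly onto one quantized value.
    zero_point = int(round(qmin - rmin / scale))
    zero_point = max(qmin, min(qmax, zero_point))
    return scale, zero_point
~~~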
@anijain2305 Let me look at it this afternoon or evening.
--
@anijain2305 to increase the throughput, can you also list the other APIs, one
per post? We can use lazy consensus: wait for a week to see people's feedback,
then summarize.
--
@FrozenGene @tqchen @u99127 Can you please approve the above API, so that we
can move to the next discussion? I have so many things to discuss :)
--
@tqchen My bad. The APIs starting with `Serialize` and `Deserialize` are
actually not exposed.
--
Some high-level comments: the interface of the serializer is an implementation
detail and should not be exposed.
I think we should focus on the serialization format (text and bytecode), which
could be more important (as we can always change the implementation).
--
@icemelon9 The length is mainly for a sanity check before we decode the
instructions. We could remove it. There could be multiple fields with variable
length; I thought we should always have a field among the fixed fields to
indicate the length of each variable one, is this right?
For example, https:
@zhiics I mean if there are two fields with variable length, though that is
not very likely, how do you plan to support it? Also, do we expect every field
to have the same data type? Another point Marisa is making is that we don't
need to put a `length` for every instruction, since many have a fixed length.
--
@icemelon9 We probably don't need a length for each variable-length field,
because we should be able to derive it from the fixed fields, right? That
means we usually put its length as one of the fixed fields of an instruction.
--
@MarisaKirisame I think we need it to make deserialization easier. Otherwise,
we may need many checks.
--
@icemelon9 Yeah, thanks. Putting the `length` before the field with variable
length seems reasonable.
--
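As an illustrative aside (not part of the thread): a minimal sketch of the layout being discussed, i.e. fixed fields first and each variable-length field immediately preceded by its own length, so no per-instruction total length is needed. The function names and the little-endian 64-bit field encoding are assumptions for illustration, not the actual TVM VM format:
~~~
import struct

# Hypothetical encoding: one opcode byte, then the fixed 64-bit fields,
# then the variable-length field preceded by its length.
def encode_instruction(opcode, fixed_fields, var_field):
    buf = struct.pack("<B", opcode)
    buf += struct.pack("<%dq" % len(fixed_fields), *fixed_fields)
    buf += struct.pack("<q", len(var_field))   # length right before the field
    buf += struct.pack("<%dq" % len(var_field), *var_field)
    return buf

def decode_instruction(buf, num_fixed):
    (opcode,) = struct.unpack_from("<B", buf, 0)
    offset = 1
    fixed = struct.unpack_from("<%dq" % num_fixed, buf, offset)
    offset += 8 * num_fixed
    (n,) = struct.unpack_from("<q", buf, offset)
    offset += 8
    var = struct.unpack_from("<%dq" % n, buf, offset)
    return opcode, list(fixed), list(var)
~~~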
I imagine not every opcode has a variable length. Is there a need for the
length field? Can we just have an optional variable_length field that stores
the length of all of the variable-length fields?
--
One question I have is whether we should put the `length` of *all fields* at
the beginning, or just put the `length` of *one array field* `val*` immediately
before that field. It's possible that we may have an instruction with more than
one array field in the future. However, the current d
OK, so changes planned are:
- Move this to `src/runtime/micro/standalone`
- Rename flag from `MINIMAL_RUNTIME` to `MICRO_STANDALONE_RUNTIME`
- Fix CI
Will work on it right now, thank you folks.
--
To make this PR actionable, @ajtulloch can you decide on the namespace
choices, make the changes, fix the CI, and let us merge it in?
--
@Shawhey, not sure why you modified the topi and relay ops, is it to add
broadcasting?
It seems there is no need to touch the topi and relay ops, because batch_matmul
was added to topi and relay a few months ago. To support tf BatchMatMul, I think
what we should do is add the support in python/tvm/relay/fron
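As an illustrative aside (not part of the message): a minimal sketch of what such a frontend mapping could look like for a 3-D tf BatchMatMul, reusing the existing relay.nn.batch_matmul. The converter name and attribute handling are assumptions, not the actual TVM frontend code:
~~~
from tvm import relay

# Hypothetical converter: relay.nn.batch_matmul computes x @ transpose(y),
# so the second operand is transposed unless the TF op already adjoints it.
def _convert_batch_matmul(x, y, adj_x=False, adj_y=False):
    if adj_x:
        x = relay.transpose(x, axes=[0, 2, 1])
    if not adj_y:
        y = relay.transpose(y, axes=[0, 2, 1])
    return relay.nn.batch_matmul(x, y)
~~~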