[Apache TVM Discuss] [Development/pre-RFC] [RFC][Tensorize] Add "reduce_last" property for TensorIntrin to support activation fusion

2021-11-03 Thread Zhuwenxi via Apache TVM Discuss
**Motivation:** The existing `TensorIntrin` has "reduce_init" and "reduce_update" to support tensorization of the reduce_axis == 0 and reduce_axis > 0 cases respectively, which already suits many use cases well. However, support for activation fusion is still missing, because it lacks the facilities…
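The phase split described above can be illustrated with a conceptual sketch in plain Python (not TVM API): "reduce_init" resets the accumulator, "reduce_update" accumulates along the reduce axis, and the proposed "reduce_last" is a one-time epilogue where an activation could be fused. The function and activation names here are hypothetical illustrations, not part of the RFC.

```python
# Conceptual sketch of the three tensorization phases (plain Python, not TVM API).

def tensorized_dot(a, b, activation):
    acc = 0                      # "reduce_init": reset the accumulator once
    for x, y in zip(a, b):       # "reduce_update": one step per reduce index
        acc += x * y
    return activation(acc)       # proposed "reduce_last": epilogue, fuses the activation

relu = lambda v: max(v, 0)
print(tensorized_dot([1, 2, 3], [4, -5, 6], relu))  # 1*4 + 2*(-5) + 3*6 = 12
```

Without a "reduce_last" hook, the activation would have to run as a separate stage after the tensorized reduction instead of inside the intrinsic.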

[Apache TVM Discuss] [Development/pre-RFC] [pre-RFC] Compilation Configuration Representation

2021-11-03 Thread Manupa Karunaratne via Apache TVM Discuss
Hi @tqchen and @zxybach, cc: @mbaret. What is a Composite Target? TVM being a multi-target compiler, it would be a bit confusing to use an Array of Targets as another Composite Target -- I think it's the terminology that is confusing here. A composite target sounds like a target that codegen…

[Apache TVM Discuss] [Development/pre-RFC] [pre-RFC] Compilation Configuration Representation

2021-11-03 Thread tqchen via Apache TVM Discuss
https://discuss.tvm.apache.org/t/rfc-composite-target/7744 --- [Visit Topic](https://discuss.tvm.apache.org/t/pre-rfc-compilation-configuration-representation/11372/7) to respond.

[Apache TVM Discuss] [Development/pre-RFC] [pre-RFC] Compilation Configuration Representation

2021-11-03 Thread tqchen via Apache TVM Discuss
Thanks for the discussions. To begin with, I am not that attached to the particular choice of name. We could, for example, decide to introduce another target kind ("hetero-target", "myawesome-target", "platform", "CompilationOption"). I think our discussion boils down to the following questions…

Re: [apache/tvm-rfcs] [RFC][TIR] Layout transformations on buffer access (#39)

2021-11-03 Thread Lunderberg
> Usage of te.AXIS_SEPARATOR: It seems this is only used on the API side but not in BufferTransform; it would be good to get some clarification.

That's correct: `te.AXIS_SEPARATOR` only appears in the API for the TE schedules, and not in the TIR graph generated from the TE schedule. I've updated…
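The role of an axis separator can be sketched in plain Python: indices on either side of the separator are flattened into distinct physical dimensions rather than one flat offset. This is an illustrative toy model under my own naming, not the TVM implementation of `te.AXIS_SEPARATOR`.

```python
# Toy illustration of axis-separator semantics (not the TVM implementation).
AXIS_SEPARATOR = object()  # sentinel marker, playing the role of te.AXIS_SEPARATOR

def flatten_with_separator(extents, indices):
    """Row-major-flatten (extent, index) pairs into one offset per separator group."""
    offsets, offset = [], 0
    for e, i in zip(extents, indices):
        if e is AXIS_SEPARATOR:
            offsets.append(offset)  # close the current physical dimension
            offset = 0
        else:
            offset = offset * e + i
    offsets.append(offset)
    return offsets

# A 4x2x8 logical buffer split into two physical dimensions (4x2 | 8):
print(flatten_with_separator([4, 2, AXIS_SEPARATOR, 8],
                             [3, 1, AXIS_SEPARATOR, 5]))  # [7, 5]
```

With no separator, all axes collapse into a single flat offset, which is the default flattening behavior.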

Re: [apache/tvm-rfcs] [RFC][TIR] Layout transformations on buffer access (#39)

2021-11-03 Thread Wuwei Lin
Thanks for adding the discussion points. I understand the difficulty of implementing it as an eager transform in TE, mainly because most other schedule primitives are not applied eagerly as they are in TIR. So adding a rewrite pass for `BufferTransform` makes sense to me.

> Should BufferTransform apply only to…

[Apache TVM Discuss] [Development/pre-RFC] [pre-RFC] Compilation Configuration Representation

2021-11-03 Thread Cody H. Yu via Apache TVM Discuss
I agree with @tqchen that improving composite targets could be more beneficial and general. We (with @junrushao1994 and @zhiics) previously attempted to improve the target system to allow more flexible attributes, such as a pass sequence / runtime / etc. specifically for the target, which is very…

[Apache TVM Discuss] [Development/pre-RFC] [pre-RFC] Compilation Configuration Representation

2021-11-03 Thread Junru Shao via Apache TVM Discuss
Thank you @Mousius for the RFC! It's great to read about potential user-experience issues with the current Target system, and I'm happy to discuss potential ways to improve it. ## Proposed APIs in the RFC `CompilationConfig`, as proposed in this RFC, aims to improve UX by wrapping a list of…

Re: [apache/tvm-rfcs] [RFC][TIR] Layout transformations on buffer access (#39)

2021-11-03 Thread Lunderberg
> Since Option2 suggests the transform is global, shall we consider BufferTransform being part of a function attribute?

I had initially placed `BufferTransform` as a statement so that it could later be extended to have a transformation defined by references to variables within the function…

Re: [apache/tvm] Apache TVM v0.8 Release Note Candidate (Issue #9416)

2021-11-03 Thread Jason
> > Should we wait for PyTorch TVM PR #8777? It should be merged soon.

> @masahi we can wait for it if this PR could get in this week

Does this mean we will update the v0.8 branch again this week? I merged a new pull request early this week, https://github.com/apache/tvm/pull/9428; will this be…

Re: [apache/tvm] Apache TVM v0.8 Release Note Candidate (Issue #9416)

2021-11-03 Thread Junru Shao
@jiangjiajun Yes, we will update the v0.8 branch and cut a release candidate on Nov 8, 2021. After the cut, we will ask the community and PMC members to test the release, and if there is no regression we will make the release official.
