Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2024-01-22 Thread Yuchen Jin
> It's worth noting that with the merging of Unity into TVM's main branch, Relax has already been _de facto_ upstreamed.

🥳

-- Reply to this email directly or view it on GitHub: https://github.com/apache/tvm-rfcs/pull/89#issuecomment-1904969432 You are receiving this because you are subscribed to this thread.

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2024-01-22 Thread Tianqi Chen
indeed, check out https://github.com/apache/tvm/issues/16446

-- Reply to this email directly or view it on GitHub: https://github.com/apache/tvm-rfcs/pull/89#issuecomment-1904964309

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2024-01-22 Thread Tianqi Chen
Closed #89.

-- Reply to this email directly or view it on GitHub: https://github.com/apache/tvm-rfcs/pull/89#event-11562870443

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2024-01-22 Thread Steven S. Lyubomirsky
It's worth noting that with the merging of Unity into TVM's main branch, Relax has already been _de facto_ upstreamed.

-- Reply to this email directly or view it on GitHub: https://github.com/apache/tvm-rfcs/pull/89#issuecomment-1904942456

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2023-09-26 Thread Tianqi Chen
Thanks @FrozenGene for bringing this up! To bring broader awareness, we posted a new strategy proposal here https://discuss.tvm.apache.org/t/discuss-tvm-core-strategy-for-emerging-needs/15751 to concretely enable LLMs and other use cases.

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2023-09-13 Thread Zhao Wu
I want to know: do we have a plan for when to merge the Unity branch into the main branch? As LLMs are so popular now, we cannot support them well without Unity.

-- Reply to this email directly or view it on GitHub: https://github.com/apache/tvm-rfcs/pull/89#issuecomment-1717067037

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2023-09-12 Thread Tianqi Chen
Sending another reminder for everyone to chime in on the related Unity discussion threads: https://discuss.tvm.apache.org/c/development/unity/14. We would love to see your participation in all the technical discussions and to see how we can collectively address your needs.

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2023-07-26 Thread Tianqi Chen
Just another update and gentle reminder: it is great to see Unity being developed and used for dynamic shape and emerging use cases. One goal of G1 is to give some time to answer questions. There are more topics of related interest (some might relate to the questions in this thread https://discus…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2023-04-04 Thread Tianqi Chen
An update: thanks to the effort of many community members, we are now at a point where the initial foundational items in the unity branch are established. One goal of G1 is to give some time to answer questions, and to provide examples to those who have shared needs to have more detailed evidence and…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2023-01-25 Thread Tianqi Chen
Five years ago, we started with a community that came together with a common vision in mind: enabling machine learning engineers to optimize and run computations efficiently on any hardware backend. Five years later, the fields of machine learning and MLC (ML compilation) have undergone rapid changes…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2023-01-12 Thread HLearning
need to update it quickly

-- Reply to this email directly or view it on GitHub: https://github.com/apache/tvm-rfcs/pull/89#issuecomment-1381231123

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2023-01-12 Thread Junru Shao
My position:
- Relay and Relax are going to co-exist as parallel submodules in TVM, and one should not affect the other at all;
- Committed to keeping Relay source code in "main" for the foreseeable future without hinting at potential deprecation;
- Having Relax in "main" >>> having Relax in a s…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-12-04 Thread Tianqi Chen
After seeing so many voices in this thread, I think it is important to provide a reply here. I am wearing the Apache TVM hat as an ASF member and Apache TVM PMC member. First of all, I would like to say thank you, everyone, for sharing your voices here. This post has received support from more…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-11-10 Thread Yuchen Jin
Thanks everyone for the discussions! A brief recap of our discussions so far:
- We are certain that Relax supports dynamic-shape workloads that are not supported by the current TVM, which can immediately benefit many community members and users.
- For why Relax should be brought into the projec…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-11-09 Thread Ligeng Zhu
I learned a lot from reading through the thread, and found that most people here are from a systems background: either doing related research in school or heading an engineering team at a company. I would like to share some of my thoughts from a different perspective, as a **TVM user** and **ML algorit…**

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-10-20 Thread Yuchen Jin
There were concerns brought up in [RFC #95](https://github.com/apache/tvm-rfcs/pull/95) that this RFC conversation did not cover "how the proposal fits into TVM". We agree that discussing the fit is important and would like to refer to related conversations and sections:
- https://github.com/Yu…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-10-08 Thread Zhi
Based on my experience at several organizations, dynamic shape support is obviously very important, particularly given the popularity of large language models. Also, efficiently supporting dynamic shapes would be one of the major appealing features of a "modern" DLC. I think the above commen…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-10-07 Thread Zihao Ye
I'm a graduate researcher at UW and have been working as a full-time SDE at AWS AI for years, mostly around deep learning frameworks and libraries. I feel like all of us agree dynamic shapes are essential, so I don't want to spend more time emphasizing how important they are. I'm not a contributor to Re…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-10-06 Thread Steven S. Lyubomirsky
For those interested, I think [this recent paper](https://arxiv.org/pdf/2210.02374.pdf) shows one way that symbolic shapes could be made to work with Relay's type checking approach (Axon is clearly structured very similarly to Relay), though it would require substantially reworking the exi…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-10-05 Thread evan lu
Thanks for this great work! Based on our experience at Meituan, dynamic shape support is important for our use cases, e.g., OCR and ASR models with dynamic seq_len. Now we can solve these with Relax and the VM runtime :)

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-10-05 Thread wrongtest
At Intellif, people have built, maintained, and extended the DL compilation stack with Relay over the past years. However, we do not think the upstreaming of a new module would break existing functionality or cause confusion; rather, we see huge opportunities to solve many technical issues which have proven to be not so easy…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-10-04 Thread Yuchen Jin
Thanks everyone for the feedback. One thing that we seem to agree on is that there is a strong need to support symbolic shape use cases in TVM, as represented by many of the folks who chimed in on this thread. Hopefully, we all agree that there is a strong need to support robust and high-qualit…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-10-04 Thread Lesheng Jin
As an individual Relax contributor from UCSD, I don’t feel our community has created an environment that welcomes new contributors and new contributions. I am happy to see different opinions in the discussion, but it's really shocking and disappointing that we are afraid of moving forward even w…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-10-04 Thread Junru Shao
Thanks for the discussion so far! Wearing my Apache TVM hat, I would love to see our community making progress to satisfy the broader community and work with the trend of deep learning compilation, rather than being gridlocked by a single party of interest.

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-10-04 Thread Christopher Sidebottom
> @Mousius In this case, @YuchenJin's reply clearly articulated that there is a close co-design of these factors, and changing to adopt dynamic alone would imply a one-step jump to Relax -- which is not incremental. The data structure change would come with a set of supporting infra and c…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-10-04 Thread Tianqi Chen
@Mousius In this case, @YuchenJin's reply clearly articulated that there is a close co-design of these factors, and changing to adopt dynamic alone would imply a one-step jump to Relax -- which is not incremental. The data structure change would come with a set of supporting infra and co-desig…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-10-03 Thread Cody Yu
In addition to the use cases and experience I've mentioned previously, I want to further highlight that symbolic shape support has become even more important in recent months, mainly due to the requirements of deploying decoder models (e.g., GPT). Since the text generation process is a natural dynam…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-10-03 Thread Zhao Wu
Based on our experience at NIO, dynamic shape support in Relax is **extremely** important for us. In fact, we have done many things on Relay trying to cover dynamic shape support for our use cases; however, the lack of first-class support for symbolic dynamic shapes still constrains us in some ops / patt…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-10-03 Thread Ziheng Jiang
Thanks @YuchenJin for updating this RFC! From the use cases that I have observed in my experience, the symbolic shape capabilities allow TVM to handle dynamic workloads that cannot be handled in other frameworks and get more widely adopted. And we can quickly enable iterations of unity co…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-10-03 Thread Tianqi Chen
Thanks everyone for the discussions so far. There are a lot of conversations around the clarifications and issues being answered. Two of the main concerns raised so far seem to be:
- rationales and alternatives
- overall scope and execution

In this case, @YuchenJin has:
- Updated the RFC w…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-08-24 Thread Yuchen Jin
Having taken on board the feedback from community members (we acknowledge the reviewers here), a number of us involved in this RFC (@YuchenJin, @jwfromm, @tqchen, @areusch, @mbaret, @jroesch, @tmoreau89) feel it’s necessary to be explicit about the scope of this proposal, and we apologize to those r…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-08-23 Thread Steven S. Lyubomirsky
On the point about potentially incorporating symbolic shapes into Relay, I would like to hear more detail about how it can be done with Relay's system of accumulating type constraints and solving them simultaneously. If we were to handle dynamic shapes in Relay, we would need to define semantics…
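To make the contrast under discussion concrete, here is a deliberately simplified sketch in plain Python (not TVM code; `ANY` and `matmul_shape` are invented for illustration) of the difference between erasing an unknown dimension, as Relay's `Any` does, and carrying it as a named symbolic variable, as Relax proposes:

```python
ANY = "?"  # stand-in for Relay-style Any: an unknown dim with no identity

def matmul_shape(a, b):
    """Toy shape rule for (m, k) x (k, n) -> (m, n)."""
    (m, k1), (k2, n) = a, b
    # With named symbols, the inner-dimension check can still fire
    # symbolically; with Any on either side it must be skipped
    # (deferred to runtime).
    if ANY not in (k1, k2):
        assert k1 == k2, f"inner dims mismatch: {k1} vs {k2}"
    return (m, n)

# Symbolic: the variable "seq" survives the op, so downstream passes
# still know the output's first dim equals the input's first dim.
print(matmul_shape(("seq", 128), (128, 10)))  # -> ('seq', 10)

# Erased: with Any as the inner dim, the mismatch check is vacuous
# and the cross-operator shape relationship is lost.
print(matmul_shape((4, ANY), (ANY, 10)))      # -> (4, 10), unchecked
```

The point is only that a named symbol preserves shape relationships across operators in a way an opaque `Any` cannot; how Relay's simultaneous constraint solving would recover that information is exactly the open question above.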

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-08-23 Thread Tianqi Chen
Thank you, everyone, for the discussions here. Let us take a step back and look at the non-technical parts of the conversation. A lot of our discussions come from two goals:
- G0: Maintaining a stable evolution solution for some of our common use-cases
- G1: Welcoming new improvements, landing our techni…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-08-23 Thread masahi
@YuchenJin

> Relax can be viewed as complementary to Relay. Relay focuses on high-level op transformations, while the current Relax passes focus on TIR-graph co-transformations that can enable flexible fusion and layout rewrite, which is hard to achieve in Relay.

I like this separation…
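As a purely illustrative sketch of what such a co-transformation buys (plain Python, not TVM; `add_one`, `double`, and `fused` are invented names), fusing two adjacent elementwise stages into one loop avoids materializing the intermediate buffer:

```python
def add_one(xs):
    """Stage 1: an elementwise op (think: one loop-level function)."""
    return [x + 1 for x in xs]

def double(xs):
    """Stage 2: another elementwise op."""
    return [x * 2 for x in xs]

def fused(xs):
    """What a fusion pass can emit when the graph-level IR can see
    into the loop-level functions: one loop, no intermediate list."""
    return [(x + 1) * 2 for x in xs]

assert fused([1, 2, 3]) == double(add_one([1, 2, 3]))  # both [4, 6, 8]
```

The win is that the fused form is only discoverable when the graph-level pass can inspect and rewrite the loop-level bodies together, which is the cross-level co-transformation being described.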

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-08-22 Thread Junru Shao
Thank you @leandron @ekalda for the questions, and @zhiics, @slyubomirsky, @Hzfengsy, @sunggg for the discussion! As a long-term contributor since 2018, the pre-Relay era, and the initiator and one of the top two contributors of RAF ([https://github.com/awslabs/raf/](https://github.com/awslabs/raf/)), the…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-08-19 Thread Cody Yu
Thanks for the RFC. Although I wasn't involved in the actual Relax development, I've been attending the weekly open design review meetings for a while, and I'm glad that I could share our experience to help improve the Relax design. Thus, I don't have specific questions about the design. Regarding the…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-08-19 Thread Siyuan Feng
Thanks @leandron and @ekalda for the comments. We all agree that we are trying to improve the graph-level IR of TVM, while the controversial point is whether we can enhance Relay to support the features from Relax. Let's discuss it directly and focus on the technical points themselves. First of all…

Re: [apache/tvm-rfcs] [RFC] Relax Upstreaming (PR #89)

2022-08-18 Thread Yuchen Jin
Hi @leandron, thanks for your feedback! :) We share a common goal of minimizing disruption while incrementally improving TVM. One of the main questions is how to bring in the improvements; that’s indeed something we have carefully thought about. One thing we found in building the unity connection i…