[Apache TVM Discuss] [Development/RFC] [Discussion/Alignment] Memory Planning

2021-05-25 Thread Rafael Stahl via Apache TVM Discuss
My proposal is now implemented. I ended up completely replacing the content of graph_plan_memory.cc with a Python implementation:
- Redirect to Python: https://github.com/tum-ei-eda/tvm/blob/e9184d948edd58635e79c3f21355f2b83b361401/src/relay/backend/graph_plan_memory.cc#L890
- Main implementat…
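A minimal sketch of the redirection idea, assuming the C++ planner in graph_plan_memory.cc simply looks up a global function and delegates to it. The registered name `relay.backend.PyGraphPlanMemory` and the return value shown here are illustrative placeholders; the actual planner is in the linked memplan.py.

```python
import tvm

# Hypothetical registration: the C++ side of graph_plan_memory.cc would fetch
# this global function by name and call it instead of running its own planner.
@tvm.register_func("relay.backend.PyGraphPlanMemory")
def plan_memory(func):
    """Assign a storage slot to every intermediate tensor of the Relay function."""
    storage_map = {}
    # ... analyze tensor lifetimes and pack buffers (done in memplan.py) ...
    return storage_map
```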

[Apache TVM Discuss] [Development/pre-RFC] [Discussion/Alignment] Memory Planning

2021-06-02 Thread Rafael Stahl via Apache TVM Discuss
Hi @aca88, thanks for your interest! For the evaluated models, we just used a single schedule as given by TVM: https://github.com/tum-ei-eda/tvm/blob/tumeda_memplan/python/tvm/relay/memplan.py#L187 You are right that for more complex graphs, we would have to evaluate more schedules to find t…
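To make the "evaluate more schedules" point concrete, here is an illustrative sketch (not the memplan.py API): simulate each candidate execution order, track the live tensor sizes, and keep the order with the lowest peak.

```python
def peak_memory(order, sizes, inputs):
    """Peak bytes live while executing `order` (node names in execution order).
    sizes: bytes produced by each node; inputs: node -> list of producer nodes."""
    last_use = {}
    for step, node in enumerate(order):
        for dep in inputs[node]:
            last_use[dep] = step  # remember each tensor's last consumer
    live, peak = set(), 0
    for step, node in enumerate(order):
        live.add(node)
        peak = max(peak, sum(sizes[n] for n in live))
        # free tensors whose last consumer has just run
        live -= {n for n, last in last_use.items() if last == step}
    return peak

def pick_schedule(candidates, sizes, inputs):
    # candidates: alternative topological orderings of the same graph
    return min(candidates, key=lambda order: peak_memory(order, sizes, inputs))
```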

[Apache TVM Discuss] [Development] Relay Function virtual_device property

2022-06-13 Thread Rafael Stahl via Apache TVM Discuss
While switching to TVMC, I noticed a "virtual_device" property on the top-level Relay module function. It was not properly propagated through my Relay passes and caused an assertion in lowering to TE, with: Check failed: (!virtual_device->IsFullyUnconstrained()) is false at: ``` File "…
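For context, the failure mode can be illustrated with a trivial Python pass; the class name is made up, and whether the check actually fires depends on how the module was annotated (e.g. by the TVMC flow):

```python
from tvm import relay
from tvm.relay.expr_functor import ExprMutator

class IdentityPass(ExprMutator):
    """Rebuilds the Function by hand; params, body, types and attrs survive,
    but the Function's separate virtual_device field is not carried over,
    which later trips the IsFullyUnconstrained() check during TE lowering."""
    def visit_function(self, fn):
        new_params = [self.visit(p) for p in fn.params]
        new_body = self.visit(fn.body)
        return relay.Function(new_params, new_body, fn.ret_type, fn.type_params, fn.attrs)
```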

[Apache TVM Discuss] [Development] Relay Function virtual_device property

2022-06-15 Thread Rafael Stahl via Apache TVM Discuss
Hi Mark, thank you for clarifying. If I'm not interested in using the virtual_device feature, is there a way to disable it? The issue is that without the patch above, it is not possible to use any pass that is based on the ExprMutator in Python, because the TE Compiler complains with the abov…
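One workaround sketch (not an official switch to disable the feature): have the pass copy the original Function's fields, including virtual_device, onto the rewritten Function. Recent TVM versions expose a FunctionWithFields helper for this; if yours does not, the same effect needs a patch like the one referenced above.

```python
from tvm.relay.expr_functor import ExprMutator
from tvm.relay.function import FunctionWithFields

class FieldPreservingPass(ExprMutator):
    def visit_function(self, fn):
        new_params = [self.visit(p) for p in fn.params]
        new_body = self.visit(fn.body)
        # Reuse the original function's remaining fields (ret_type, attrs,
        # virtual_device, span) instead of constructing a bare relay.Function.
        return FunctionWithFields(fn, list(new_params), new_body)
```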