| Issue | 173144 |
| Summary | Why hasn't LLVM added official support for FP8 data types yet? |
| Labels | new issue |
| Assignees | |
| Reporter | 1801ZDL |
Hi LLVM Community,
I'm currently working on adding FP8 data type support to a deep learning compiler, and I need to use FP8 in LLVM-based toolchains. During my research, I noticed a [related PR](https://github.com/llvm/llvm-project/pull/89900) proposing the addition of FP8 data types (including the common e4m3fn and e5m2 formats) to LLVM. However, that PR does not appear to have been merged into the main branch.
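For context, since the question hinges on these two encodings, below is a minimal sketch in plain Python (not LLVM code, and not taken from the PR; the helper names are mine) of how the e5m2 and e4m3fn bit layouts decode, assuming the commonly used definitions: e5m2 is IEEE-like with infinities, while e4m3fn is finite-only with a single NaN pattern per sign.

```python
# Decode the two 8-bit floating-point formats mentioned above.
# Bit layouts: e5m2 = 1 sign / 5 exponent / 2 mantissa bits;
#              e4m3fn = 1 sign / 4 exponent / 3 mantissa bits, where "fn"
# (finite-only) means the all-ones exponent is reused for normal values
# and only S.1111.111 encodes NaN.

def decode_fp8(byte: int, exp_bits: int, man_bits: int, finite_only: bool) -> float:
    sign = -1.0 if (byte >> 7) & 1 else 1.0
    exp = (byte >> man_bits) & ((1 << exp_bits) - 1)
    man = byte & ((1 << man_bits) - 1)
    bias = (1 << (exp_bits - 1)) - 1
    if exp == (1 << exp_bits) - 1:
        if finite_only:
            # e4m3fn: only the all-ones pattern is NaN; other values with
            # a maximal exponent are ordinary normal numbers
            if man == (1 << man_bits) - 1:
                return float("nan")
        else:
            # e5m2: IEEE-style infinity/NaN encodings
            return sign * float("inf") if man == 0 else float("nan")
    if exp == 0:
        # subnormal: no implicit leading 1
        return sign * man * 2.0 ** (1 - bias - man_bits)
    return sign * (1 + man / (1 << man_bits)) * 2.0 ** (exp - bias)

def decode_e5m2(byte: int) -> float:
    return decode_fp8(byte, exp_bits=5, man_bits=2, finite_only=False)

def decode_e4m3fn(byte: int) -> float:
    return decode_fp8(byte, exp_bits=4, man_bits=3, finite_only=True)

print(decode_e5m2(0b0_11110_11))    # largest finite e5m2 value: 57344.0
print(decode_e4m3fn(0b0_1111_110))  # largest finite e4m3fn value: 448.0
```

Note that because e4m3fn repurposes the all-ones exponent and drops infinities, it does not follow the standard IEEE-754 interchange pattern, which may be part of why adding such types to LLVM is not a purely mechanical change.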
I'm writing to ask about the specific reasons why this FP8-related PR was not merged: was it due to technical design disagreements, an incomplete implementation, a lack of consensus on FP8 standardization, or other factors?
Additionally, as FP8 has become increasingly crucial for optimizing the performance and memory efficiency of large deep learning models (especially in training and inference scenarios), I would like to inquire about the LLVM community's roadmap for FP8 support. Is there a clear timeline for when FP8 data types will be officially incorporated into LLVM? Are there any ongoing efforts or new PRs that we can track or contribute to?
Any insights or updates on this topic would be greatly appreciated. Thank you very much for your time and help!