https://bugs.kde.org/show_bug.cgi?id=497938
--- Comment #5 from Michael Miller <michael_mil...@msn.com> ---
(In reply to chair-tweet-decal from comment #4)
> Indeed, the fact that it's Python might complicate things. It would likely
> be possible to export the model to ONNX, but that would require hosting the
> converted model, as I don't think it's available in that format. Also, the
> ONNX runtime can range from 50MB to 300MB depending on the OS and whether
> there's GPU support.
>
> Not to mention, using ONNX could add extra complexity.
>
> There's also LibTorch, which could be used to run the model without ONNX,
> but you would still need to convert the model, which adds another dependency.

Running an ONNX model isn't a problem. We already use several ONNX models in the face recognition engine, and in the soon-to-be-released image classification engine. Since OpenCV is built into digiKam, we use the OpenCV ONNX runtime (which also handles other model formats such as Caffe and DarkNet).

Cheers,
Mike

--
You are receiving this mail because:
You are watching all bug changes.