<snip>

> > From: Jerin Jacob [mailto:jerinjac...@gmail.com]
> > Sent: Tuesday, 16 August 2022 15.13
> >
> > On Wed, Aug 3, 2022 at 8:49 PM Stephen Hemminger
> > <step...@networkplumber.org> wrote:
> > >
> > > On Wed, 3 Aug 2022 18:58:37 +0530
> > > <jer...@marvell.com> wrote:
> > >
> > > > Roadmap
> > > > -------
> > > > 1) Address the comments for this RFC.
> > > > 2) Common code for mldev
> > > > 3) SW mldev driver based on TVM (https://tvm.apache.org/)
> > >
> > > Having a SW implementation is important because then it can be
> > > covered by tests.
> >
> > Yes. That is the reason for adding a TVM-based SW driver as item (3).
> >
> > Are there any other high-level or API-level comments before proceeding
> > with v1 and the implementation?
> 
> Have you seriously considered if the DPDK Project is the best home for this
> project? I can easily imagine the DPDK development process being a hindrance
> in many aspects for an evolving AI/ML library. Off the top of my head, it
> would probably be better off as a separate project, like SPDK.
There is a lot of talk about using ML in networking workloads, but I am not 
sure what the use case looks like. For example: is the inference engine going 
to be inline (i.e., the packet goes through the inference engine before 
reaching the CPU and provides some data; what sort of data?), or look-aside 
(does it require the packets to be sent to the inference engine, or is it 
some other data?)? What would an end-to-end use case be? A sample application 
using these APIs would be helpful.

IMO, if we need to share the packets with the inference engine, then it fits 
into DPDK.

As I understand it, there are many mature open-source projects for 
ML/inference outside of DPDK. Does it make sense for DPDK to adopt those 
projects rather than invent our own?

> 
> If all this stuff can be completely omitted at build time, I have no 
> objections.
> 
> A small note about naming (not intending to start a flame war, so please feel
> free to ignore!): I haven't worked seriously with ML/AI since university three
> decades ago, so I'm quite rusty in the domain. However, I don't see any
> Machine Learning functions proposed by this API. The library provides an API
> to an Inference Engine - but nobody says the inference model stems from
> Machine Learning; it might as well be a hand crafted model. Do you plan to
> propose APIs for training the models? If not, the name of the library could
> confuse some potential users.
I think, at least on edge devices, we need a dedicated inference device, as 
ML requires significant cycles/power.
 
> 
> > Or Anyone else interested to review or contribute to this new DPDK
> > device class?
