On Fri, Sep 18, 2020 at 02:13:02AM +0000, Honnappa Nagarahalli wrote:
> <snip>
> > > > > > >>> > > >> > diff --git a/lib/librte_eal/include/rte_eal.h
> > >> b/lib/librte_eal/include/rte_eal.h
> > >>>> index ddcf6a2e7a..8148f650f2 100644
> > >>>> --- a/lib/librte_eal/include/rte_eal.h
> > >>>> +++ b/lib/librte_eal/include/rte_eal.h
> > >>>> @@ -43,6 +43,13 @@ enum rte_proc_type_t {
> > >>>> 	RTE_PROC_INVALID
> > >>>> };
> > >>>>
> > >>>> +enum rte_max_simd_t {
> > >>> We could add a RTE_MAX_SIMD = 0. Arm platforms can use this to
> > >>> choose
> > >> SVE.
> > >>>
> > >>
> > >> Is zero the best value for this? Would setting it to MAX_INT or some
> > >> other big number be better, in terms of comparisons operations, or
> > >> does that just not apply at all with SVE?
> > > I suggested zero as the bitwidth can be specified from the command line.
> > > It
> > would be much easier to input zero vs other number.
> >
> > Right, but it doesn't end up being that intuitive as interface
> > 0 is enabled, 64 is not, 128 is enabled etc ....
> >
> > Suggest we use a max 16bit integer as 0xFFFF?
> I think there are 2 things here:
> 1) What is the internal representation (for ex: the value of the enum)? Here
> assigning 0xFFFF should be fine.
> 2) The input value at the command line. Is it possible to say that, if the
> user does not provide anything, then we set the option as 0xFFFF? This would
> mean that SVE would be used by default on Arm platforms (which is ok for me).
>
Makes sense. That is all perfectly doable, because the initial default value is set per architecture.
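
For illustration, roughly what I have in mind is the following. This is only a
sketch of the idea discussed above, not the actual patch: the enum names and
values, the helper function, the option name mentioned in the comment and the
RTE_ARCH_ARM64 check are all placeholders.

#include <stdint.h>

/* Sketch only: names and values follow this discussion, not the final patch. */
enum rte_max_simd_t {
	RTE_NO_SIMD      = 64,         /* scalar paths only */
	RTE_MAX_128_SIMD = 128,        /* e.g. NEON / SSE */
	RTE_MAX_256_SIMD = 256,        /* e.g. AVX2 */
	RTE_MAX_512_SIMD = 512,        /* e.g. AVX-512 */
	RTE_MAX_SIMD     = UINT16_MAX  /* 0xFFFF: no upper bound, allows SVE */
};

/* Hypothetical helper: pick the per-architecture default used when the
 * user gives no bitwidth on the command line (whatever the final EAL
 * option ends up being called). */
static uint16_t
default_max_simd_bitwidth(void)
{
#if defined(RTE_ARCH_ARM64)        /* arch check shown for illustration only */
	return RTE_MAX_SIMD;       /* SVE usable by default on Arm */
#else
	return RTE_MAX_256_SIMD;   /* illustrative default for other archs */
#endif
}

Since 0xFFFF is simply the largest 16-bit value, the usual
"required_bitwidth <= max_bitwidth" comparison still behaves sensibly, which
avoids the special-casing that a 0 sentinel would have needed.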