Hi All,

I'm running into an issue with SPDK-based applications built with DPDK using
cryptodev in a primary/secondary process setup.

Primary Process: An application that uses crypto and properly initializes aesni_mb.
Secondary Process: An application that does not use crypto, so it does not
initialize aesni_mb.

What I'm seeing:

1) Primary Process: Initializes normally

2) Secondary Process: At DPDK initialization time,
rte_eal_init() -> rte_bus_scan() -> vdev_scan()
    are called in that order. vdev_scan() contains a block of code that sends a
    message to the primary process asking for the names of the devices it has
    initialized, so the secondary ends up adding the aesni_mb device and
    initializing it.

3) Secondary Process: Exits, but because this SPDK application did not initialize
    crypto, it does not call rte_vdev_uninit(), so on exit it reports several
    memory leaks from allocations made during DPDK's initialization of aesni_mb,
    for example in alloc_mb_mgr().

It seems as though the secondary side should be tearing down aesni_mb itself,
assuming it was not the one that called rte_cryptodev_pmd_create(), but I'm
fairly new to these code sequences, so I'm looking for advice on what makes
sense here to address the issue.

Thanks,
Paul
