Sudhir,

It is an error topology/configuration that we don't currently handle. Please try this patch and report back:
https://gerrit.fd.io/r/c/vpp/+/32292

With the patch, container-1 will form one bonding group with container-2, using either BondEthernet0 or BondEthernet1.

Steven

From: <vpp-dev@lists.fd.io> on behalf of "Sudhir CR via lists.fd.io" <sudhir=rtbrick....@lists.fd.io>
Reply-To: "sud...@rtbrick.com" <sud...@rtbrick.com>
Date: Tuesday, May 11, 2021 at 7:30 PM
To: "vpp-dev@lists.fd.io" <vpp-dev@lists.fd.io>
Subject: [vpp-dev] observing issue with LACP port selection logic

Hi all,

I am configuring LACP between two containers.
VPP version used: 20.09

The topology looks like this:

[topology diagram omitted: image001.png]

In the topology above, the memif-4/4 interface is not part of the same bond interface on both containers (it reports a different partner system ID), so memif-4/4 should not be marked as an active member of BondEthernet0 in container-1. Nevertheless, it is being attached to BondEthernet0. Any help in fixing this issue would be appreciated.

Please find the configuration in container-1:

DBGvpp# show bond
interface name   sw_if_index   mode   load balance   active members   members
BondEthernet0    9             lacp   l23            3                3

DBGvpp# show bond details
BondEthernet0
  mode: lacp
  load balance: l23
  number of active members: 3
    memif2/2
    memif3/3
    memif4/4
  number of members: 3
    memif2/2
    memif3/3
    memif4/4
  device instance: 0
  interface id: 0
  sw_if_index: 9
  hw_if_index: 9

DBGvpp# show lacp
                                                actor state                      partner state
interface name   sw_if_index   bond interface   exp/def/dis/col/syn/agg/tim/act  exp/def/dis/col/syn/agg/tim/act
memif2/2         2             BondEthernet0    0 0 1 1 1 1 1 1                  0 0 1 1 1 1 1 1
  LAG ID: [(ffff,7a-67-1e-01-0c-02,0009,00ff,0001), (ffff,7a-37-f7-00-0c-02,000f,00ff,0001)]
  RX-state: CURRENT, TX-state: TRANSMIT, MUX-state: COLLECTING_DISTRIBUTING, PTX-state: PERIODIC_TX
memif3/3         3             BondEthernet0    0 0 1 1 1 1 1 1                  0 0 1 1 1 1 1 1
  LAG ID: [(ffff,7a-67-1e-01-0c-02,0009,00ff,0002), (ffff,7a-37-f7-00-0c-02,000f,00ff,0002)]
  RX-state: CURRENT, TX-state: TRANSMIT, MUX-state: COLLECTING_DISTRIBUTING, PTX-state: PERIODIC_TX
memif4/4         4             BondEthernet0    0 0 1 1 1 1 1 1                  0 0 1 1 1 1 1 1
  LAG ID: [(ffff,7a-67-1e-01-0c-02,0009,00ff,0003), (ffff,7a-37-f7-00-0c-04,0010,00ff,0001)]
  RX-state: CURRENT, TX-state: TRANSMIT, MUX-state: COLLECTING_DISTRIBUTING, PTX-state: PERIODIC_TX
DBGvpp#

Please find the configuration in container-2:

DBGvpp# show bond
interface name   sw_if_index   mode   load balance   active members   members
BondEthernet0    15            lacp   l23            2                2
BondEthernet1    16            lacp   l23            1                1

DBGvpp# show bond details
BondEthernet0
  mode: lacp
  load balance: l23
  number of active members: 2
    memif2/2
    memif3/3
  number of members: 2
    memif2/2
    memif3/3
  device instance: 0
  interface id: 0
  sw_if_index: 15
  hw_if_index: 15
BondEthernet1
  mode: lacp
  load balance: l23
  number of active members: 1
    memif4/4
  number of members: 1
    memif4/4
  device instance: 1
  interface id: 1
  sw_if_index: 16
  hw_if_index: 16

DBGvpp# show lacp
                                                actor state                      partner state
interface name   sw_if_index   bond interface   exp/def/dis/col/syn/agg/tim/act  exp/def/dis/col/syn/agg/tim/act
memif2/2         8             BondEthernet0    0 0 1 1 1 1 1 1                  0 0 1 1 1 1 1 1
  LAG ID: [(ffff,7a-37-f7-00-0c-02,000f,00ff,0001), (ffff,7a-67-1e-01-0c-02,0009,00ff,0001)]
  RX-state: CURRENT, TX-state: TRANSMIT, MUX-state: COLLECTING_DISTRIBUTING, PTX-state: PERIODIC_TX
memif3/3         9             BondEthernet0    0 0 1 1 1 1 1 1                  0 0 1 1 1 1 1 1
  LAG ID: [(ffff,7a-37-f7-00-0c-02,000f,00ff,0002), (ffff,7a-67-1e-01-0c-02,0009,00ff,0002)]
  RX-state: CURRENT, TX-state: TRANSMIT, MUX-state: COLLECTING_DISTRIBUTING, PTX-state: PERIODIC_TX
memif4/4         10            BondEthernet1    0 0 1 1 1 1 1 1                  0 0 1 1 1 1 1 1
  LAG ID: [(ffff,7a-37-f7-00-0c-04,0010,00ff,0001), (ffff,7a-67-1e-01-0c-02,0009,00ff,0003)]
  RX-state: CURRENT, TX-state: TRANSMIT, MUX-state: COLLECTING_DISTRIBUTING, PTX-state: PERIODIC_TX
DBGvpp#

Thanks and Regards,
Sudhir
View/Reply Online (#19379): https://lists.fd.io/g/vpp-dev/message/19379