From: vpp-dev@lists.fd.io <vpp-dev@lists.fd.io> on behalf of Sudhir CR via 
lists.fd.io <sudhir=rtbrick....@lists.fd.io>
Date: Thursday, 10 June 2021 at 08:50
To: vpp-dev@lists.fd.io <vpp-dev@lists.fd.io>
Subject: [vpp-dev] vpp hangs with bfd configuration
Hi All,
While trying to establish a BFD session between two containers, VPP went into 
an infinite loop while processing "adj_bfd_notify" and hung in one of the 
containers. This issue is reproducible every time with the topology and 
configuration below.

Any help in fixing the issue would be appreciated.

Topology:

  Container1 (memif32321/32321)  -----------------  (memif32321/32321)Container2

Configuration:
Container1
========
set interface ip address memif32321/32321 4.4.4.4/24
ip table add 100
ip route add 4.4.4.0/24 table 100 via 4.4.4.5 memif32321/32321 out-labels 4444
ip route add 4.4.4.5/32 table 100 via 4.4.4.5 memif32321/32321 out-labels 4444

set interface mpls memif32321/32321 enable
mpls local-label add 8888 eos via 4.4.4.5 memif32321/32321 ip4-lookup-in-table 100

What’s the intent here? Do you want to forward via the memif, or do a lookup? 
You can’t do both.
Fix that and see if it helps.

/neale
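For illustration, the two intents could be expressed separately, e.g. (untested sketch, reusing the label, next-hop and interface names from the config above; exact CLI syntax may vary by VPP version):

```
# forward via the memif only:
mpls local-label add 8888 eos via 4.4.4.5 memif32321/32321

# or do the IP table lookup only:
mpls local-label add 8888 eos via ip4-lookup-in-table 100
```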


bfd udp session add interface memif32321/32321 local-addr 4.4.4.4 peer-addr 4.4.4.5 desired-min-tx 400000 required-min-rx 400000 detect-mult 3

Container2
========
set interface ip address memif32321/32321 4.4.4.5/24
ip table add 100
ip route add 4.4.4.0/24 table 100 via 4.4.4.4 memif32321/32321 out-labels 8888
ip route add 4.4.4.4/32 table 100 via 4.4.4.4 memif32321/32321 out-labels 8888
set interface mpls memif32321/32321 enable
mpls local-label add 4444 eos via 4.4.4.4 memif32321/32321 ip4-lookup-in-table 100
bfd udp session add interface memif32321/32321 local-addr 4.4.4.5 peer-addr 4.4.4.4 desired-min-tx 400000 required-min-rx 400000 detect-mult 3

VPP version: 20.09

(gdb) thread apply all bt

Thread 3 (Thread 0x7f7ac6ffe700 (LWP 422)):
#0  0x00007f7b67036ffe in vlib_worker_thread_barrier_check () at 
/home/supervisor/development/libvpp/src/vlib/threads.h:438
#1  0x00007f7b6703152e in vlib_main_or_worker_loop (vm=0x7f7b46cf3240, 
is_main=0) at /home/supervisor/development/libvpp/src/vlib/main.c:1788
#2  0x00007f7b67030d47 in vlib_worker_loop (vm=0x7f7b46cf3240) at 
/home/supervisor/development/libvpp/src/vlib/main.c:2008
#3  0x00007f7b6708892a in vlib_worker_thread_fn (arg=0x7f7b41f14540) at 
/home/supervisor/development/libvpp/src/vlib/threads.c:1862
#4  0x00007f7b668adc44 in clib_calljmp () at 
/home/supervisor/development/libvpp/src/vppinfra/longjmp.S:123
#5  0x00007f7ac6ffdec0 in ?? ()
#6  0x00007f7b67080ad3 in vlib_worker_thread_bootstrap_fn (arg=0x7f7b41f14540) 
at /home/supervisor/development/libvpp/src/vlib/threads.c:585
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 2 (Thread 0x7f7ac77ff700 (LWP 421)):
#0  0x00007f7b67036fef in vlib_worker_thread_barrier_check () at 
/home/supervisor/development/libvpp/src/vlib/threads.h:437
#1  0x00007f7b6703152e in vlib_main_or_worker_loop (vm=0x7f7b45fe8b80, 
is_main=0) at /home/supervisor/development/libvpp/src/vlib/main.c:1788
#2  0x00007f7b67030d47 in vlib_worker_loop (vm=0x7f7b45fe8b80) at 
/home/supervisor/development/libvpp/src/vlib/main.c:2008
#3  0x00007f7b6708892a in vlib_worker_thread_fn (arg=0x7f7b41f14440) at 
/home/supervisor/development/libvpp/src/vlib/threads.c:1862
#4  0x00007f7b668adc44 in clib_calljmp () at 
/home/supervisor/development/libvpp/src/vppinfra/longjmp.S:123
#5  0x00007f7ac77feec0 in ?? ()
#6  0x00007f7b67080ad3 in vlib_worker_thread_bootstrap_fn (arg=0x7f7b41f14440) 
at /home/supervisor/development/libvpp/src/vlib/threads.c:585
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 1 (Thread 0x7f7b739b7740 (LWP 226)):
#0  0x00007f7b681c952b in fib_node_list_remove (list=54, sibling=63) at 
/home/supervisor/development/libvpp/src/vnet/fib/fib_node_list.c:246
#1  0x00007f7b681c7695 in fib_node_child_remove (parent_type=FIB_NODE_TYPE_ADJ, 
parent_index=1, sibling_index=63)
    at /home/supervisor/development/libvpp/src/vnet/fib/fib_node.c:131
#2  0x00007f7b681b2395 in fib_walk_destroy (fwi=2) at 
/home/supervisor/development/libvpp/src/vnet/fib/fib_walk.c:262
#3  0x00007f7b681b2f13 in fib_walk_sync (parent_type=FIB_NODE_TYPE_ADJ, 
parent_index=1, ctx=0x7f7b2e08dc90)
    at /home/supervisor/development/libvpp/src/vnet/fib/fib_walk.c:818
#4  0x00007f7b6821ed4d in adj_nbr_update_rewrite_internal (adj=0x7f7b46e08c80, 
adj_next_index=IP_LOOKUP_NEXT_REWRITE, this_node=426,
    next_node=682, rewrite=0x7f7b4a5c4b40 
"z\001\277d\004\004zP\245d\004\004\210G")
    at /home/supervisor/development/libvpp/src/vnet/adj/adj_nbr.c:472
#5  0x00007f7b6821eb99 in adj_nbr_update_rewrite (adj_index=2, 
flags=ADJ_NBR_REWRITE_FLAG_COMPLETE,
    rewrite=0x7f7b4a5c4b40 "z\001\277d\004\004zP\245d\004\004\210G") at 
/home/supervisor/development/libvpp/src/vnet/adj/adj_nbr.c:335
#6  0x00007f7b67c1e02d in ip_neighbor_mk_complete (ai=2, ipn=0x7f7b476dd0d8)
    at 
/home/supervisor/development/libvpp/src/vnet/ip-neighbor/ip_neighbor.c:337
#7  0x00007f7b67c11683 in ip_neighbor_mk_complete_walk (ai=2, 
ctx=0x7f7b476dd0d8)
    at 
/home/supervisor/development/libvpp/src/vnet/ip-neighbor/ip_neighbor.c:364
#8  0x00007f7b68220063 in adj_nbr_walk_nh4 (sw_if_index=5, addr=0x7f7b476dcf60, 
cb=0x7f7b67c11660 <ip_neighbor_mk_complete_walk>,
    ctx=0x7f7b476dd0d8) at 
/home/supervisor/development/libvpp/src/vnet/adj/adj_nbr.c:626
#9  0x00007f7b682202f8 in adj_nbr_walk_nh (sw_if_index=5, 
adj_nh_proto=FIB_PROTOCOL_IP4, nh=0x7f7b476dcf54,
    cb=0x7f7b67c11660 <ip_neighbor_mk_complete_walk>, ctx=0x7f7b476dd0d8) at 
/home/supervisor/development/libvpp/src/vnet/adj/adj_nbr.c:677
#10 0x00007f7b67c127f0 in ip_neighbor_update (vnm=0x7f7b688133e8 <vnet_main>, 
ai=2)
    at 
/home/supervisor/development/libvpp/src/vnet/ip-neighbor/ip_neighbor.c:661
#11 0x00007f7b6788ec7d in ethernet_update_adjacency (vnm=0x7f7b688133e8 
<vnet_main>, sw_if_index=5, ai=2)
    at /home/supervisor/development/libvpp/src/vnet/ethernet/interface.c:219
#12 0x00007f7b68244bbf in vnet_update_adjacency_for_sw_interface 
(vnm=0x7f7b688133e8 <vnet_main>, sw_if_index=5, ai=2)
    at /home/supervisor/development/libvpp/src/vnet/adj/rewrite.c:187
#13 0x00007f7b6821e667 in adj_nbr_add_or_lock (nh_proto=FIB_PROTOCOL_IP4, 
link_type=VNET_LINK_MPLS, nh_addr=0x7f7b46d171e0, sw_if_index=5)
    at /home/supervisor/development/libvpp/src/vnet/adj/adj_nbr.c:270
#14 0x00007f7b681f33fb in fib_path_attached_next_hop_get_adj 
(path=0x7f7b46d171c8, link=VNET_LINK_MPLS, dpo=0x7f7b2e08e168)
    at /home/supervisor/development/libvpp/src/vnet/fib/fib_path.c:674
#15 0x00007f7b681f2ed0 in fib_path_contribute_forwarding (path_index=52, 
fct=FIB_FORW_CHAIN_TYPE_MPLS_NON_EOS, dpo=0x7f7b2e08e168)
    at /home/supervisor/development/libvpp/src/vnet/fib/fib_path.c:2475
#16 0x00007f7b681fb2b9 in fib_path_ext_stack (path_ext=0x7f7b435e7220, 
child_fct=FIB_FORW_CHAIN_TYPE_UNICAST_IP4,
    imp_null_fct=FIB_FORW_CHAIN_TYPE_UNICAST_IP4, nhs=0x7f7b4285f8d0) at 
/home/supervisor/development/libvpp/src/vnet/fib/fib_path_ext.c:241
#17 0x00007f7b681d11b3 in fib_entry_src_collect_forwarding (pl_index=50, 
path_index=52, arg=0x7f7b2e08e380)
    at /home/supervisor/development/libvpp/src/vnet/fib/fib_entry_src.c:476
#18 0x00007f7b681ec18d in fib_path_list_walk (path_list_index=50, 
func=0x7f7b681d1020 <fib_entry_src_collect_forwarding>, ctx=0x7f7b2e08e380)
    at /home/supervisor/development/libvpp/src/vnet/fib/fib_path_list.c:1408
#19 0x00007f7b681d0964 in fib_entry_src_mk_lb (fib_entry=0x7f7b46d38bf0, 
esrc=0x7f7b45a576d0, fct=FIB_FORW_CHAIN_TYPE_UNICAST_IP4,
    dpo_lb=0x7f7b46d38c18) at 
/home/supervisor/development/libvpp/src/vnet/fib/fib_entry_src.c:576
#20 0x00007f7b681d15f3 in fib_entry_src_action_install 
(fib_entry=0x7f7b46d38bf0, source=FIB_SOURCE_CLI)
    at /home/supervisor/development/libvpp/src/vnet/fib/fib_entry_src.c:706
#21 0x00007f7b681d251f in fib_entry_src_action_reactivate 
(fib_entry=0x7f7b46d38bf0, source=FIB_SOURCE_CLI)
    at /home/supervisor/development/libvpp/src/vnet/fib/fib_entry_src.c:1222
#22 0x00007f7b681cf4e2 in fib_entry_back_walk_notify (node=0x7f7b46d38bf0, 
ctx=0x7f7b2e08e668)
    at /home/supervisor/development/libvpp/src/vnet/fib/fib_entry.c:316
#23 0x00007f7b681c77e2 in fib_node_back_walk_one (ptr=0x7f7b2e08e688, 
ctx=0x7f7b2e08e668)
    at /home/supervisor/development/libvpp/src/vnet/fib/fib_node.c:161
#24 0x00007f7b681b228a in fib_walk_advance (fwi=1) at 
/home/supervisor/development/libvpp/src/vnet/fib/fib_walk.c:368
#25 0x00007f7b681b2e20 in fib_walk_sync (parent_type=FIB_NODE_TYPE_PATH_LIST, 
parent_index=50, ctx=0x7f7b2e08e828)
    at /home/supervisor/development/libvpp/src/vnet/fib/fib_walk.c:792
#26 0x00007f7b681e27b6 in fib_path_list_back_walk (path_list_index=50, 
ctx=0x7f7b2e08e828)
    at /home/supervisor/development/libvpp/src/vnet/fib/fib_path_list.c:500
#27 0x00007f7b681fa6d5 in fib_path_back_walk_notify (node=0x7f7b46d171c8, 
ctx=0x7f7b2e08e828)
    at /home/supervisor/development/libvpp/src/vnet/fib/fib_path.c:1226
#28 0x00007f7b681c77e2 in fib_node_back_walk_one (ptr=0x7f7b2e08e848, 
ctx=0x7f7b2e08e828)
    at /home/supervisor/development/libvpp/src/vnet/fib/fib_node.c:161
#29 0x00007f7b681b228a in fib_walk_advance (fwi=0) at 
/home/supervisor/development/libvpp/src/vnet/fib/fib_walk.c:368
#30 0x00007f7b681b2e20 in fib_walk_sync (parent_type=FIB_NODE_TYPE_ADJ, 
parent_index=1, ctx=0x7f7b2e08e8e0)
    at /home/supervisor/development/libvpp/src/vnet/fib/fib_walk.c:792
#31 0x00007f7b6824c340 in adj_bfd_update_walk (ai=1) at 
/home/supervisor/development/libvpp/src/vnet/adj/adj_bfd.c:105
#32 0x00007f7b6824b837 in adj_bfd_notify (event=BFD_LISTEN_EVENT_UPDATE, 
session=0x7f7b46ed6738)
    at /home/supervisor/development/libvpp/src/vnet/adj/adj_bfd.c:198
#33 0x00007f7b67c5a6ca in bfd_notify_listeners (bm=0x7f7b6881b7f0 <bfd_main>, 
event=BFD_LISTEN_EVENT_UPDATE, bs=0x7f7b46ed6738)
    at /home/supervisor/development/libvpp/src/vnet/bfd/bfd_main.c:450
#34 0x00007f7b67c628ea in bfd_rpc_notify_listeners_cb (a=0x130094fa9) at 
/home/supervisor/development/libvpp/src/vnet/bfd/bfd_main.c:617
#35 0x00007f7b68874754 in vl_api_rpc_call_t_handler (mp=0x130094f90) at 
/home/supervisor/development/libvpp/src/vlibmemory/vlib_api.c:531
#36 0x00007f7b6888b06f in vl_msg_api_handler_with_vm_node (am=0x7f7b68a9ed18 
<api_global_main>, vlib_rp=0x13002b000, the_msg=0x130094f90,
    vm=0x7f7b672f0c40 <vlib_global_main>, node=0x7f7b427fed80, is_private=0 
'\000')
    at /home/supervisor/development/libvpp/src/vlibapi/api_shared.c:635
#37 0x00007f7b68846c4b in vl_mem_api_handle_rpc (vm=0x7f7b672f0c40 
<vlib_global_main>, node=0x7f7b427fed80)
    at /home/supervisor/development/libvpp/src/vlibmemory/memory_api.c:746
#38 0x00007f7b68868797 in vl_api_clnt_process (vm=0x7f7b672f0c40 
<vlib_global_main>, node=0x7f7b427fed80, f=0x0)
    at /home/supervisor/development/libvpp/src/vlibmemory/vlib_api.c:337
#39 0x00007f7b670368ed in vlib_process_bootstrap (_a=140167380903912) at 
/home/supervisor/development/libvpp/src/vlib/main.c:1464
#40 0x00007f7b668adc44 in clib_calljmp () at 
/home/supervisor/development/libvpp/src/vppinfra/longjmp.S:123
#41 0x00007f7b42f2a7e0 in ?? ()
#42 0x00007f7b6703632f in vlib_process_startup (vm=0x11442f2a850, 
p=0x73f1147b8e1390, f=0x114)
    at /home/supervisor/development/libvpp/src/vlib/main.c:1489
#43 0x0000003546e0d9d0 in ?? ()
#44 0x000000000000001d in ?? ()
#45 0x0000003a00000001 in ?? ()
#46 0x000000000000001f in ?? ()
#47 0x00007f7b672f0c40 in ?? () from 
/home/supervisor/development/libvpp/build-root/install-vpp_debug-native/vpp/lib/libvlib.so.1.0.1
#48 0x00007f7b42f2a950 in ?? ()
#49 0x00007f7b688449e1 in memclnt_queue_callback (vm=<error reading variable: 
Cannot access memory at address 0xfffffffffffffff8>)
    at /home/supervisor/development/libvpp/src/vlibmemory/memory_api.c:110
Backtrace stopped: previous frame inner to this frame (corrupt stack?)
(gdb)
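One detail worth noting in thread 1: fib_walk_sync on FIB_NODE_TYPE_ADJ with parent_index=1 appears at both frame #30 and frame #3, i.e. the synchronous back-walk re-enters a walk on the same adjacency it started from. As an illustration only (a toy graph in Python, not VPP's actual data structures or algorithm), this is the classic shape of a dependency back-walk that needs cycle protection to terminate:

```python
# Toy dependency graph: each node notifies its children when it changes.
# With a cycle (adj -> path -> entry -> adj, the same shape as frames
# #30..#3 above), a naive synchronous walk would recurse forever;
# tracking nodes whose walk is already in progress breaks the loop.

def back_walk(graph, node, visiting=None):
    """Visit every child reachable from node at most once; return visit order."""
    if visiting is None:
        visiting = set()
    if node in visiting:          # a walk on this node is already in progress
        return []                 # stop here instead of looping forever
    visiting.add(node)
    order = [node]
    for child in graph.get(node, []):
        order.extend(back_walk(graph, child, visiting))
    return order

# adj1 -> path50 -> entry -> adj1 forms a cycle
graph = {"adj1": ["path50"], "path50": ["entry"], "entry": ["adj1"]}
print(back_walk(graph, "adj1"))  # each node visited once, then the walk stops
```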

Thanks and Regards,
Sudhir
