   - What you did that made the problem appear.
We launched two VMs on the same host. The host runs OVS with DPDK and the VM
ports are vhost-user. We then generated bidirectional traffic (each VM sends to
the other) using DPDK from within the VMs. A sketch of the kind of
configuration involved is included after this report.

   - What you expected to happen.   
Traffic runs with no issues.

   - What actually happened.   
OVS gets stuck. The relevant portion of ovs-vswitchd.log is included at the
end of this message.

   
   - The Open vSwitch version number (as output by ovs-vswitchd --version)   
2.5.0   

   - Any local patches or changes you have applied (if any).   
f519a72d9a3708fbc5f796f176e7c8bd3dcfb738   
   

The following are also handy sometimes:   
   - The kernel version on which Open vSwitch is running (from /proc/version) 
and the distribution and version number of your OS (e.g. "Centos 5.0").   
The host runs a Debian-8-based hLinux (HPE Linux); the kernel is based on
3.14.48 (gcc version 4.9.2 (Debian 4.9.2-10)).
The VMs run CentOS 6.


We can supply more information if necessary, such as our exact build process.
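
For reference, here is a minimal sketch of the kind of configuration involved,
based on the OVS 2.5 DPDK documentation. The bridge and port names, the PCI
address, and the testpmd invocation are illustrative assumptions, not our exact
commands:

    # Host side: userspace (netdev) bridge with two vhost-user ports.
    # OVS creates the vhost-user sockets under /var/run/openvswitch.
    ovs-vsctl add-br br0 -- set bridge br0 datapath_type=netdev
    ovs-vsctl add-port br0 vhost-user-1 -- \
        set Interface vhost-user-1 type=dpdkvhostuser
    ovs-vsctl add-port br0 vhost-user-2 -- \
        set Interface vhost-user-2 type=dpdkvhostuser

    # Guest side (in each VM): bind the virtio-net device to a DPDK-capable
    # driver and generate traffic, e.g. with testpmd in txonly mode.
    # The PCI address 0000:00:03.0 is illustrative.
    modprobe uio_pci_generic
    dpdk_nic_bind.py --bind=uio_pci_generic 0000:00:03.0
    testpmd -c 0x3 -n 4 -- --forward-mode=txonly --auto-start

Relevant excerpt from ovs-vswitchd.log: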
2016-04-20T11:54:50.785Z|00341|dpdk(vhost_thread2)|INFO|State of queue 0 ( 
tx_qid 0 ) of vhost device '/var/run/openvswitch/vhu9e6ec7f5-ac' 1 changed to 
'enabled'
2016-04-20T11:54:50.785Z|00342|dpdk(vhost_thread2)|INFO|State of queue 2 ( 
tx_qid 1 ) of vhost device '/var/run/openvswitch/vhu9e6ec7f5-ac' 1 changed to 
'disabled'
2016-04-20T11:54:50.785Z|00343|dpdk(vhost_thread2)|INFO|State of queue 4 ( 
tx_qid 2 ) of vhost device '/var/run/openvswitch/vhu9e6ec7f5-ac' 1 changed to 
'disabled'
2016-04-20T11:54:50.785Z|00344|dpdk(vhost_thread2)|INFO|State of queue 0 ( 
tx_qid 0 ) of vhost device '/var/run/openvswitch/vhu4e7e4a88-79' 2 changed to 
'enabled'
2016-04-20T11:54:50.786Z|00345|dpdk(vhost_thread2)|INFO|State of queue 6 ( 
tx_qid 3 ) of vhost device '/var/run/openvswitch/vhu9e6ec7f5-ac' 1 changed to 
'disabled'
2016-04-20T11:54:50.788Z|00346|dpdk(vhost_thread2)|INFO|State of queue 2 ( 
tx_qid 1 ) of vhost device '/var/run/openvswitch/vhu4e7e4a88-79' 2 changed to 
'disabled'
2016-04-20T11:54:50.788Z|00347|dpdk(vhost_thread2)|INFO|State of queue 4 ( 
tx_qid 2 ) of vhost device '/var/run/openvswitch/vhu4e7e4a88-79' 2 changed to 
'disabled'
2016-04-20T11:54:50.789Z|00348|dpdk(vhost_thread2)|INFO|State of queue 6 ( 
tx_qid 3 ) of vhost device '/var/run/openvswitch/vhu4e7e4a88-79' 2 changed to 
'disabled'
2016-04-20T11:54:50.900Z|00349|dpdk(vhost_thread2)|INFO|vHost Device 
'/var/run/openvswitch/vhu9e6ec7f5-ac' 1 has been added
2016-04-20T11:54:50.901Z|00350|dpdk(vhost_thread2)|INFO|State of queue 0 ( 
tx_qid 0 ) of vhost device '/var/run/openvswitch/vhu9e6ec7f5-ac' 1 changed to 
'enabled'
2016-04-20T11:54:50.902Z|00351|dpdk(vhost_thread2)|INFO|State of queue 2 ( 
tx_qid 1 ) of vhost device '/var/run/openvswitch/vhu9e6ec7f5-ac' 1 changed to 
'enabled'
2016-04-20T11:54:50.904Z|00352|dpdk(vhost_thread2)|INFO|State of queue 4 ( 
tx_qid 2 ) of vhost device '/var/run/openvswitch/vhu9e6ec7f5-ac' 1 changed to 
'disabled'
2016-04-20T11:54:50.905Z|00353|dpdk(vhost_thread2)|INFO|State of queue 6 ( 
tx_qid 3 ) of vhost device '/var/run/openvswitch/vhu9e6ec7f5-ac' 1 changed to 
'disabled'
2016-04-20T11:54:53.750Z|00366|poll_loop|INFO|Dropped 39 log messages in last 
3185 seconds (most recently, 3183 seconds ago) due to excessive rate
2016-04-20T11:54:53.750Z|00367|poll_loop|INFO|wakeup due to [POLLIN] on fd 34 
(character device /dev/net/tun) at lib/netdev-linux.c:1127 (83% CPU usage)
2016-04-20T11:54:53.750Z|00368|poll_loop|INFO|wakeup due to [POLLIN] on fd 33 
(character device /dev/net/tun) at lib/netdev-linux.c:1127 (83% CPU usage)
2016-04-20T11:54:53.750Z|00369|poll_loop|INFO|wakeup due to [POLLIN] on fd 34 
(character device /dev/net/tun) at lib/netdev-linux.c:1127 (83% CPU usage)
2016-04-20T11:54:53.750Z|00370|poll_loop|INFO|wakeup due to [POLLIN] on fd 33 
(character device /dev/net/tun) at lib/netdev-linux.c:1127 (83% CPU usage)
2016-04-20T11:54:53.750Z|00371|poll_loop|INFO|wakeup due to [POLLIN] on fd 34 
(character device /dev/net/tun) at lib/netdev-linux.c:1127 (83% CPU usage)
2016-04-20T11:54:53.750Z|00372|poll_loop|INFO|wakeup due to [POLLIN] on fd 33 
(character device /dev/net/tun) at lib/netdev-linux.c:1127 (83% CPU usage)
2016-04-20T11:54:53.750Z|00373|poll_loop|INFO|wakeup due to [POLLIN] on fd 34 
(character device /dev/net/tun) at lib/netdev-linux.c:1127 (83% CPU usage)
2016-04-20T11:54:53.750Z|00374|poll_loop|INFO|wakeup due to [POLLIN] on fd 33 
(character device /dev/net/tun) at lib/netdev-linux.c:1127 (83% CPU usage)
2016-04-20T11:54:53.750Z|00375|poll_loop|INFO|wakeup due to [POLLIN] on fd 34 
(character device /dev/net/tun) at lib/netdev-linux.c:1127 (83% CPU usage)
2016-04-20T11:54:53.750Z|00376|poll_loop|INFO|wakeup due to [POLLIN] on fd 33 
(character device /dev/net/tun) at lib/netdev-linux.c:1127 (83% CPU usage)
2016-04-20T11:55:00.057Z|00377|poll_loop|INFO|Dropped 256626 log messages in 
last 7 seconds (most recently, 1 seconds ago) due to excessive rate
2016-04-20T11:55:00.057Z|00378|poll_loop|INFO|wakeup due to [POLLIN] on fd 138 
(FIFO pipe:[56788120]) at lib/dpif-netdev.c:2663 (69% CPU usage)
2016-04-20T11:57:26.087Z|00001|ovs_rcu(urcu3)|WARN|blocked 1000 ms waiting for 
pmd101 to quiesce
2016-04-20T11:57:27.088Z|00002|ovs_rcu(urcu3)|WARN|blocked 2001 ms waiting for 
pmd101 to quiesce
2016-04-20T11:57:29.088Z|00003|ovs_rcu(urcu3)|WARN|blocked 4001 ms waiting for 
pmd101 to quiesce
2016-04-20T11:57:33.087Z|00004|ovs_rcu(urcu3)|WARN|blocked 8000 ms waiting for 
pmd101 to quiesce
2016-04-20T11:57:41.088Z|00005|ovs_rcu(urcu3)|WARN|blocked 16001 ms waiting for 
pmd101 to quiesce
2016-04-20T11:57:56.972Z|00379|ovs_rcu|WARN|blocked 1000 ms waiting for pmd101 
to quiesce
2016-04-20T11:57:57.087Z|00006|ovs_rcu(urcu3)|WARN|blocked 32000 ms waiting for 
pmd101 to quiesce
2016-04-20T11:57:57.972Z|00380|ovs_rcu|WARN|blocked 2000 ms waiting for pmd101 
to quiesce
2016-04-20T11:57:59.973Z|00381|ovs_rcu|WARN|blocked 4000 ms waiting for pmd101 
to quiesce
2016-04-20T11:58:03.972Z|00382|ovs_rcu|WARN|blocked 8000 ms waiting for pmd101 
to quiesce
2016-04-20T11:58:11.973Z|00383|ovs_rcu|WARN|blocked 16001 ms waiting for pmd101 
to quiesce
2016-04-20T11:58:27.973Z|00384|ovs_rcu|WARN|blocked 32001 ms waiting for pmd101 
to quiesce
2016-04-20T11:58:29.087Z|00007|ovs_rcu(urcu3)|WARN|blocked 64000 ms waiting for 
pmd101 to quiesce
2016-04-20T11:58:59.972Z|00385|ovs_rcu|WARN|blocked 64000 ms waiting for pmd101 
to quiesce
2016-04-20T11:59:33.088Z|00008|ovs_rcu(urcu3)|WARN|blocked 128001 ms waiting 
for pmd101 to quiesce
2016-04-20T12:00:03.973Z|00386|ovs_rcu|WARN|blocked 128001 ms waiting for 
pmd101 to quiesce
2016-04-20T12:00:44.794Z|00354|timeval(vhost_thread2)|WARN|Unreasonably long 
1382911ms poll interval (17ms user, 38ms system)
2016-04-20T12:00:44.794Z|00355|timeval(vhost_thread2)|WARN|context switches: 
1467 voluntary, 3 involuntary
2016-04-20T12:00:44.794Z|00356|coverage(vhost_thread2)|INFO|Event coverage, avg 
rate over last: 5 seconds, last minute, last hour,  hash=5a386cce:
2016-04-20T12:00:44.794Z|00357|coverage(vhost_thread2)|INFO|bridge_reconfigure  
       0.0/sec     0.000/sec        0.0364/sec   total: 131
2016-04-20T12:00:44.794Z|00358|coverage(vhost_thread2)|INFO|ofproto_flush       
       0.0/sec     0.000/sec        0.0006/sec   total: 2
2016-04-20T12:00:44.794Z|00359|coverage(vhost_thread2)|INFO|ofproto_recv_openflow
      0.0/sec     0.000/sec        1.4658/sec   total: 5277
2016-04-20T12:00:44.794Z|00360|coverage(vhost_thread2)|INFO|ofproto_update_port 
       0.0/sec     0.000/sec        0.0339/sec   total: 122
2016-04-20T12:00:44.794Z|00361|coverage(vhost_thread2)|INFO|ofproto_dpif_expired
       0.0/sec     0.000/sec        0.0047/sec   total: 17
2016-04-20T12:00:44.794Z|00362|coverage(vhost_thread2)|INFO|rev_reconfigure     
       0.0/sec     0.000/sec        0.0544/sec   total: 196
2016-04-20T12:00:44.794Z|00363|coverage(vhost_thread2)|INFO|rev_port_toggled    
       0.0/sec     0.000/sec        0.0025/sec   total: 9
2016-04-20T12:00:44.794Z|00364|coverage(vhost_thread2)|INFO|rev_flow_table      
       0.0/sec     0.000/sec        0.0269/sec   total: 97
2016-04-20T12:00:44.794Z|00365|coverage(vhost_thread2)|INFO|rev_mac_learning    
       0.0/sec     0.000/sec        0.1275/sec   total: 459
2016-04-20T12:00:44.794Z|00366|coverage(vhost_thread2)|INFO|dumped_new_flow     
       0.0/sec     0.300/sec        0.3372/sec   total: 1214
2016-04-20T12:00:44.794Z|00367|coverage(vhost_thread2)|INFO|xlate_actions       
       0.0/sec     0.050/sec        4.3350/sec   total: 15606
2016-04-20T12:00:44.794Z|00368|coverage(vhost_thread2)|INFO|cmap_expand         
       0.0/sec     0.000/sec        0.0478/sec   total: 172
2016-04-20T12:00:44.794Z|00369|coverage(vhost_thread2)|INFO|cmap_shrink         
       0.0/sec     0.017/sec        0.0439/sec   total: 158
2016-04-20T12:00:44.794Z|00370|coverage(vhost_thread2)|INFO|dpif_port_add       
       0.0/sec     0.000/sec        0.0097/sec   total: 35
2016-04-20T12:00:44.794Z|00371|coverage(vhost_thread2)|INFO|dpif_port_del       
       0.0/sec     0.000/sec        0.0053/sec   total: 19
2016-04-20T12:00:44.794Z|00372|coverage(vhost_thread2)|INFO|dpif_flow_flush     
       0.0/sec     0.000/sec        0.0014/sec   total: 5
2016-04-20T12:00:44.794Z|00373|coverage(vhost_thread2)|INFO|dpif_flow_get       
       0.0/sec     0.000/sec        0.0019/sec   total: 7
2016-04-20T12:00:44.794Z|00374|coverage(vhost_thread2)|INFO|dpif_flow_put       
       0.0/sec     0.000/sec        0.0275/sec   total: 99
2016-04-20T12:00:44.794Z|00375|coverage(vhost_thread2)|INFO|dpif_flow_del       
       0.0/sec     0.267/sec        0.3369/sec   total: 1217
2016-04-20T12:00:44.794Z|00376|coverage(vhost_thread2)|INFO|dpif_execute        
       0.0/sec     0.000/sec        0.0025/sec   total: 9
2016-04-20T12:00:44.794Z|00377|coverage(vhost_thread2)|INFO|flow_extract        
       0.0/sec     0.000/sec        0.0017/sec   total: 6
2016-04-20T12:00:44.794Z|00378|coverage(vhost_thread2)|INFO|miniflow_malloc     
       0.0/sec     0.000/sec        1.8414/sec   total: 6629
2016-04-20T12:00:44.794Z|00379|coverage(vhost_thread2)|INFO|hmap_pathological   
       0.0/sec     0.000/sec        1.9783/sec   total: 7122
2016-04-20T12:00:44.794Z|00380|coverage(vhost_thread2)|INFO|hmap_expand         
       0.0/sec     0.000/sec      399.6108/sec   total: 1438599
2016-04-20T12:00:44.794Z|00381|coverage(vhost_thread2)|INFO|mac_learning_learned
       0.0/sec     0.000/sec        0.0186/sec   total: 67
2016-04-20T12:00:44.794Z|00382|coverage(vhost_thread2)|INFO|mac_learning_expired
       0.0/sec     0.000/sec        0.0100/sec   total: 36
2016-04-20T12:00:44.794Z|00383|coverage(vhost_thread2)|INFO|netdev_received     
       0.0/sec     0.000/sec      430.1486/sec   total: 1548535
2016-04-20T12:00:44.794Z|00384|coverage(vhost_thread2)|INFO|netdev_sent         
       0.0/sec     0.000/sec      617.3522/sec   total: 2222468
2016-04-20T12:00:44.794Z|00385|coverage(vhost_thread2)|INFO|netdev_get_stats    
       0.0/sec     0.000/sec        4.5992/sec   total: 16557
2016-04-20T12:00:44.794Z|00386|coverage(vhost_thread2)|INFO|txn_unchanged       
       0.0/sec     0.000/sec        0.0375/sec   total: 135
2016-04-20T12:00:44.794Z|00387|coverage(vhost_thread2)|INFO|txn_incomplete      
       0.0/sec     0.000/sec        0.2675/sec   total: 963
2016-04-20T12:00:44.794Z|00388|coverage(vhost_thread2)|INFO|txn_success         
       0.0/sec     0.000/sec        0.2069/sec   total: 745
2016-04-20T12:00:44.794Z|00389|coverage(vhost_thread2)|INFO|poll_create_node    
     378.4/sec   281.500/sec     3207.0108/sec   total: 11547031
2016-04-20T12:00:44.794Z|00390|coverage(vhost_thread2)|INFO|poll_zero_timeout   
       0.0/sec     0.000/sec        0.1747/sec   total: 629
2016-04-20T12:00:44.794Z|00391|coverage(vhost_thread2)|INFO|rconn_queued        
       0.0/sec     0.000/sec        0.9664/sec   total: 3479
2016-04-20T12:00:44.794Z|00392|coverage(vhost_thread2)|INFO|rconn_sent          
       0.0/sec     0.000/sec        0.9664/sec   total: 3479
2016-04-20T12:00:44.794Z|00393|coverage(vhost_thread2)|INFO|seq_change          
     1087.8/sec  1071.067/sec     3853.4842/sec   total: 13877911
2016-04-20T12:00:44.794Z|00394|coverage(vhost_thread2)|INFO|pstream_open        
       0.0/sec     0.000/sec        0.0031/sec   total: 11
2016-04-20T12:00:44.794Z|00395|coverage(vhost_thread2)|INFO|stream_open         
       0.0/sec     0.000/sec        0.0003/sec   total: 1
2016-04-20T12:00:44.794Z|00396|coverage(vhost_thread2)|INFO|unixctl_received    
       0.0/sec     0.000/sec        0.0006/sec   total: 2
2016-04-20T12:00:44.794Z|00397|coverage(vhost_thread2)|INFO|unixctl_replied     
       0.0/sec     0.000/sec        0.0006/sec   total: 2
2016-04-20T12:00:44.794Z|00398|coverage(vhost_thread2)|INFO|util_xalloc         
     807.0/sec   613.533/sec     6357.0656/sec   total: 22889418
2016-04-20T12:00:44.794Z|00399|coverage(vhost_thread2)|INFO|vconn_received      
       0.0/sec     0.000/sec        1.9536/sec   total: 7033
2016-04-20T12:00:44.794Z|00400|coverage(vhost_thread2)|INFO|vconn_sent          
       0.0/sec     0.000/sec        1.4542/sec   total: 5235
2016-04-20T12:00:44.794Z|00401|coverage(vhost_thread2)|INFO|netdev_set_policing 
       0.0/sec     0.000/sec        0.0181/sec   total: 65
2016-04-20T12:00:44.794Z|00402|coverage(vhost_thread2)|INFO|netdev_get_ifindex  
       0.0/sec     0.000/sec        0.0097/sec   total: 35
2016-04-20T12:00:44.794Z|00403|coverage(vhost_thread2)|INFO|netdev_get_hwaddr   
       0.0/sec     0.000/sec        0.0439/sec   total: 158
2016-04-20T12:00:44.794Z|00404|coverage(vhost_thread2)|INFO|netdev_set_hwaddr   
       0.0/sec     0.000/sec        0.0011/sec   total: 4
2016-04-20T12:00:44.794Z|00405|coverage(vhost_thread2)|INFO|netdev_get_ethtool  
       0.0/sec     0.000/sec        0.1247/sec   total: 449
2016-04-20T12:00:44.794Z|00406|coverage(vhost_thread2)|INFO|netlink_received    
       0.0/sec     0.000/sec        2.4711/sec   total: 8896
2016-04-20T12:00:44.794Z|00407|coverage(vhost_thread2)|INFO|netlink_recv_jumbo  
       0.0/sec     0.000/sec        1.2544/sec   total: 4516
2016-04-20T12:00:44.794Z|00408|coverage(vhost_thread2)|INFO|netlink_sent        
       0.0/sec     0.000/sec        2.4086/sec   total: 8671
2016-04-20T12:00:44.794Z|00409|coverage(vhost_thread2)|INFO|nln_changed         
       0.0/sec     0.000/sec        0.0233/sec   total: 84
2016-04-20T12:00:44.794Z|00410|coverage(vhost_thread2)|INFO|47 events never hit
2016-04-20T12:00:45.794Z|00411|ovs_rcu(vhost_thread2)|WARN|blocked 1000 ms 
waiting for pmd101 to quiesce
2016-04-20T12:00:46.795Z|00412|ovs_rcu(vhost_thread2)|WARN|blocked 2000 ms 
waiting for pmd101 to quiesce
2016-04-20T12:00:48.795Z|00413|ovs_rcu(vhost_thread2)|WARN|blocked 4001 ms 
waiting for pmd101 to quiesce
2016-04-20T12:00:52.795Z|00414|ovs_rcu(vhost_thread2)|WARN|blocked 8001 ms 
waiting for pmd101 to quiesce
2016-04-20T12:01:00.795Z|00415|ovs_rcu(vhost_thread2)|WARN|blocked 16001 ms 
waiting for pmd101 to quiesce
2016-04-20T12:01:16.795Z|00416|ovs_rcu(vhost_thread2)|WARN|blocked 32000 ms 
waiting for pmd101 to quiesce
2016-04-20T12:01:41.087Z|00009|ovs_rcu(urcu3)|WARN|blocked 256000 ms waiting 
for pmd101 to quiesce
2016-04-20T12:01:48.795Z|00417|ovs_rcu(vhost_thread2)|WARN|blocked 64001 ms 
waiting for pmd101 to quiesce
2016-04-20T12:02:11.972Z|00387|ovs_rcu|WARN|blocked 256000 ms waiting for 
pmd101 to quiesce
2016-04-20T12:02:27.022Z|00418|dpdk(vhost_thread2)|INFO|vHost Device 
'/var/run/openvswitch/vhu49c5db58-4f' 0 has been removed
2016-04-20T12:02:27.199Z|00419|dpdk(vhost_thread2)|INFO|vHost Device 
'/var/run/openvswitch/vhu9e6ec7f5-ac' 1 has been removed
