I've measured the time taken by convolutional decoding in gr-ieee802-11. The
module uses the Punctured Convolutional Code class from the IT++ library (
http://itpp.sourceforge.net/4.3.0/classitpp_1_1Punctured__Convolutional__Code.html
)
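
For reference, this is roughly the kind of setup I mean; it is only a sketch
of how I understand the IT++ class is driven (generators 0133/0171 and
constraint length 7 are from the 802.11a standard), not the exact
gr-ieee802-11 code:

    // Sketch only, not the exact gr-ieee802-11 code: a rate-1/2 802.11a
    // convolutional code decoded with IT++'s Punctured_Convolutional_Code.
    #include <itpp/itcomm.h>

    int main()
    {
        itpp::Punctured_Convolutional_Code code;

        // 802.11a mother code: constraint length 7, generators 0133/0171 (octal)
        itpp::ivec generator(2);
        generator(0) = 0133;
        generator(1) = 0171;
        code.set_generator_polynomials(generator, 7);

        // Rate 1/2: all-ones puncture matrix, i.e. no puncturing
        itpp::bmat puncture_matrix = "1;1";
        code.set_puncture_matrix(puncture_matrix);

        // Placeholder for the +1/-1 soft values; decode_tail runs the
        // Viterbi decoder over the whole block.
        itpp::vec rx_signal = itpp::ones(9000);
        itpp::bvec decoded;
        code.decode_tail(rx_signal, decoded);

        return 0;
    }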

I used std::chrono (<chrono>) to measure the elapsed time. You can see how I
did it on the following page (
https://gist.github.com/gsongsong/7c4081f44e88a7f4407a#file-ofdm_decode_mac-cc-L252-L257
)
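
The measurement itself is nothing more than this (a stripped-down sketch of
the same idea as in the gist; decode() here just stands in for the actual
decoder call in ofdm_decode_mac.cc):

    // Stripped-down sketch of the timing: wrap the decode call in
    // std::chrono::high_resolution_clock and print microseconds.
    #include <chrono>
    #include <iostream>

    // Stand-in for the real convolutional decoding call being timed
    static void decode() { /* ... */ }

    int main()
    {
        auto start = std::chrono::high_resolution_clock::now();
        decode();
        auto stop = std::chrono::high_resolution_clock::now();

        auto us = std::chrono::duration_cast<std::chrono::microseconds>(
                      stop - start).count();
        std::cout << "decode took " << us << " us" << std::endl;
        return 0;
    }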

I measured the time with a loopback flow graph (without a USRP;
examples/wifi_loopback.grc)

The result is that it takes from 5,000 to 30,000 us, i.e. 5 to 30 ms, to
decode a signal of 9,000 samples (each sample is either 1 or -1).
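
Spelled out, 9,000 samples in 5,000 to 30,000 us corresponds to roughly 0.3
to 1.8 million samples per second through the decoder:

    // Pure arithmetic on the numbers above: implied decoder throughput
    #include <cstdio>

    int main()
    {
        const double samples    = 9000.0;   // block length
        const double t_best_us  = 5000.0;   // fastest measured decode
        const double t_worst_us = 30000.0;  // slowest measured decode

        // samples per microsecond == millions of samples per second
        std::printf("best case : %.2f Msamples/s\n", samples / t_best_us);
        std::printf("worst case: %.2f Msamples/s\n", samples / t_worst_us);
        return 0;
    }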

* Test environment: Ubuntu 14.04 on VMware, 2 CPUs and 4 GB RAM allocated
* Host environment: Windows 7 with an i7-3770 @ 3.7 GHz

Since I am not familiar with error-correcting codes, I have no idea what
order of magnitude the decoding time should be. But I believe that Viterbi is
one of the most efficient decoding algorithms and that IT++ uses it.

From this I deduce that convolutional decoding takes quite a long time even
though the algorithm (Viterbi) is very efficient. Is this a natural
limitation of software decoding and SDR?

Another question: commercial off-the-shelf (COTS) Wi-Fi devices achieve very
high throughput, which must rely on much faster convolutional decoding. Is
that because COTS devices use heavily optimized FPGAs and dedicated decoding
chips?

Regards,
Jeon.