Hi List,

Small update: problem "resolved". In quotes because, although I now have
a version that works consistently, I still have no idea why the
previous version didn't work, or through what mechanism it was causing
the ADI to fail ...


A bit more detail:

The custom logic is essentially a sort of correlator, and deep inside
it there are several accumulators. Some control signals allow loading,
adding or subtracting data to them. Originally those were generated
inside a 'generate' loop in the Verilog of a larger module and written
fairly naively. Their logic is simple enough that it can fit in a
single layer of LUT6s, but on the first synthesis attempt, the fact
that it was operating on arrays of arrays of bits was apparently
confusing XST (for instance, the synthesis report was showing a bunch
of 1-bit registers instead of a single 30-bit register, and things
like that). The placer was also doing random stuff, placing things at
opposite ends of the FPGA, resulting in slow build times, missed
timing, and excessive resource usage. So I rewrote that part trying to
be "clever", explicitly writing out the logic equations I expected ISE
to use and pack into the LUT6s, and that sort of worked: logic size
was reduced and it met timing easily.
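For reference, each accumulator is behaviorally something like the
sketch below. This is only a minimal illustration with made-up port
names (none of these identifiers are from my actual design), just to
show the load/add/subtract control structure I'm describing:

```verilog
// Hypothetical sketch of one accumulator: a registered value that can
// be loaded directly, or have data added to / subtracted from it.
module accum #(
    parameter integer WIDTH = 30
)(
    input  wire             clk,
    input  wire             load,   // load 'data' directly into acc
    input  wire             sub,    // 1: subtract data, 0: add data
    input  wire             ena,    // accumulate enable
    input  wire [WIDTH-1:0] data,
    output reg  [WIDTH-1:0] acc
);
    always @(posedge clk)
    begin
        if (load)
            acc <= data;
        else if (ena)
            acc <= sub ? (acc - data) : (acc + data);
    end
endmodule
```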

But of course that was the bitstream where the ADI failed to initialize ...

Now, through many, many builds, progressively removing / adding logic,
I managed to isolate that failing behavior to the particular part I
had rewritten. Along the way, I also ended up isolating that part in a
separate Verilog module instead of creating it directly in the
upper-level module. Then, following a hunch of Brian Padalino's, I
rewrote that submodule, going back to the naive Verilog description of
its behavior. And now that it was in a separate Verilog module and no
longer operating on arrays of wires, XST actually did the right thing.

And, much to my surprise, that bitstream worked.

Now you might think this is a random timing thing, but it seems really
consistent. I can vary the number of compute units in my logic
(increasing logic size accordingly), I can enable/disable Radio 1 (the
other channel of the B210), or include/remove ChipScope. None of those
variations has any impact: a build using my old code for that
submodule fails to init the ADI, and a build using the new code for
that submodule works. I did dozens of builds, always with the same
results.

That block is buried deep in my custom logic; it is nowhere near any
clock crossing, and it's running off bus_clk. It's a pure datapath and
has no influence on any of the control logic ... I have _NO_IDEA_ how
it could have any influence on the ADI initialization.
I probed the SPI bus in the working and non-working cases and couldn't
see a difference. I looked at the supply voltages and couldn't see a
difference either. Honestly, even if ISE were screwing up completely
and corrupting other parts of the logic, I don't see how the ADI would
fail to respond, given that the externally observed SPI lines look
good.



Cheers,

    Sylvain Munaut

_______________________________________________
USRP-users mailing list
USRP-users@lists.ettus.com
http://lists.ettus.com/mailman/listinfo/usrp-users_lists.ettus.com