So, I was reading someone's Python block, written as a general block, that 
wanted to work on chunks of X elements at a time.  If it got fewer than X, it 
simply reported that it had consumed nothing and waited to be called back 
repeatedly until it was eventually given X or more elements.   Note that X 
wasn't a particularly big number (I think 128?).

It seemed to work, but isn't it perhaps risky to assume you'll eventually get X 
elements?  Then again, this is a streaming environment, so sooner or later more 
data should arrive, right?
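For concreteness, here is a hypothetical plain-Python sketch (not the original poster's code, and not using the actual GNU Radio classes) of the contract being described: a general_work-style function that reports zero items consumed until at least X items are available, relying on the runtime to call it back with more data. The names `general_work`, `BLOCK_SIZE`, and the pass-through "processing" are illustrative assumptions.

```python
BLOCK_SIZE = 128  # the "X" from the post; an assumed value

def general_work(input_items, output_items):
    """Return the number of input items consumed.

    Mimics the scheduler contract described above: returning 0 tells
    the runtime we consumed nothing, so it will call us again once
    more data has arrived.
    """
    if len(input_items) < BLOCK_SIZE:
        return 0  # too few items: consume nothing, wait for callback
    # Process as many whole BLOCK_SIZE chunks as we have.
    n = (len(input_items) // BLOCK_SIZE) * BLOCK_SIZE
    output_items[:n] = input_items[:n]  # placeholder "processing"
    return n
```

The appeal of this style is that all the partial-chunk buffering lives in the runtime's input buffer rather than in the block itself.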

I suppose another approach would be to keep a local buffer in the block and 
append to it until it holds X or more elements, but then you have to do more 
copying and bookkeeping instead of just pushing that burden onto the run-time, 
as my previous example did.
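A hypothetical sketch of that local-buffer alternative, again in plain Python rather than real GNU Radio code: the block always consumes everything it is handed, appends it to its own buffer, and peels off full chunks. The class and method names are made up for illustration; the `extend` and `del` calls are exactly the extra copying and bookkeeping mentioned above.

```python
BLOCK_SIZE = 128  # the "X" from the post; an assumed value

class BufferingBlock:
    """Toy block that buffers locally instead of deferring to the runtime."""

    def __init__(self):
        self.buf = []

    def work(self, input_items):
        """Consume everything; return a list of full BLOCK_SIZE chunks."""
        self.buf.extend(input_items)       # the extra copy
        chunks = []
        while len(self.buf) >= BLOCK_SIZE:
            chunks.append(self.buf[:BLOCK_SIZE])
            del self.buf[:BLOCK_SIZE]      # the extra bookkeeping
        return chunks
```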

Another approach might be to use set_history() with X so that the previous X 
elements would always be presented and you would not have to buffer locally (?), 
but you would still have to do the bookkeeping to fetch blocks of size X.

I think I'm going to experiment a bunch with these different approaches, but I 
would appreciate some guidance on best practices...

Regards,
Dave
