If you are feeding a flowgraph a finite number of samples, there is no guarantee the last samples will be processed before the flowgraph terminates. Could that be what you're seeing? Otherwise, post what you're doing and someone can try to help further.
On Wed, Oct 27, 2021 at 3:00 PM Verónica Toro Betancur <[email protected]> wrote:

> Hi Jeff,
>
> Thank you for your reply.
>
> I have tried returning len(output_items[0]) and using it in the consume
> function and it still doesn't work.
>
> Also, if I don't use consume() or consume_each(), it seems like the last
> part of the signal is dropped and I can't decode it correctly in the blocks
> that come afterwards.
>
> Best regards,
> Verónica
>
> On Wed, Oct 27, 2021 at 8:16 PM Jeff Long <[email protected]> wrote:
>
>> The input vector may contain more items than the scheduler is expecting
>> you to return. Use len(output_items[0]) to determine how much to consume
>> and return. For reference, here is the autogenerated code for a new module:
>>
>>     def work(self, input_items, output_items):
>>         in0 = input_items[0]
>>         out = output_items[0]
>>         # <+signal processing here+>
>>         out[:] = in0
>>         return len(output_items[0])
>>
>> On Wed, Oct 27, 2021 at 11:46 AM Verónica Toro Betancur <[email protected]> wrote:
>>
>>> Hi,
>>>
>>> I've been trying to implement a sync python embedded block that
>>> processes all input_items. At the end of the work() function, I call
>>>
>>>     output_items[0][:] = input_items[0]
>>>     self.consume_each(len(input_items[0]))
>>>     return len(input_items[0])
>>>
>>> This works well the first time and all data is processed correctly, but
>>> then, the block stops working, i.e., it doesn't process any new upcoming
>>> data and input_items[0] is always filled with zeros.
>>>
>>> I hope someone could help me with this.
>>>
>>> Thanks in advance.
>>>
>>> Best regards,
>>> Verónica
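[For readers landing here later: the core of Jeff's advice is that in a GNU Radio sync block the scheduler consumes input for you, based on the count work() returns, so calling self.consume_each() on top of that double-consumes and can stall the block. A minimal sketch of the intended work() behavior, using plain NumPy buffers so it can be exercised outside a flowgraph (the function name and single input/output port are illustrative):]

```python
import numpy as np

def work(input_items, output_items):
    """Sketch of a sync-block work(): copy input to output and return
    how many items were produced. No consume_each() call here -- in a
    GNU Radio sync_block the scheduler consumes exactly the number of
    input items that work() returns."""
    in0 = input_items[0]
    out = output_items[0]
    # In a sync_block each call sees matching input/output lengths
    # per port, so len(out) is the safe amount to process and return.
    n = len(out)
    out[:] = in0[:n]  # <+signal processing would go here+>
    return n
```

[Under this model, any input items beyond the returned count simply stay in the buffer and are presented again on the next call, which is why returning len(output_items[0]) alone is sufficient.]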
