Hi!

You can subscribe to the various mailing lists (and other things) via the
website [1]. The process is essentially to send an email to dev-subscribe
with "Subscribe" in the subject line.

You can concatenate chunks in any order, as long as they have the same schema.
A couple of straightforward ways might be:
- create a Table from many RecordBatches [2]
- create a Table from each chunk, then concatenate all of the tables [3]

I think there are still a variety of approaches and levels of abstraction at
which you can do this, but these are probably the highest-level and simplest;
a rough sketch of both is below.
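
A minimal sketch, assuming each chunk has already been decoded into an
arrow::RecordBatch and all of them share a schema (the helper names here are
just placeholders, not part of the Arrow API):

  #include <memory>
  #include <vector>

  #include <arrow/api.h>  // Table, RecordBatch, ConcatenateTables, Result

  // Option [2]: build a single Table directly from the per-chunk batches.
  arrow::Result<std::shared_ptr<arrow::Table>> CombineBatches(
      const std::vector<std::shared_ptr<arrow::RecordBatch>>& batches) {
    return arrow::Table::FromRecordBatches(batches);
  }

  // Option [3]: wrap each chunk in its own Table as it arrives, then
  // concatenate the Tables. With the default options this stitches the
  // existing chunks together rather than rewriting them.
  arrow::Result<std::shared_ptr<arrow::Table>> CombinePerChunkTables(
      const std::vector<std::shared_ptr<arrow::Table>>& tables) {
    return arrow::ConcatenateTables(tables);
  }

In both cases the resulting Table keeps the data as chunks internally (a
ChunkedArray per column), so, as far as I know, nothing gets copied into one
contiguous buffer.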


[1]: https://arrow.apache.org/community/
[2]:
https://arrow.apache.org/docs/cpp/api/table.html#_CPPv4N5arrow5Table17FromRecordBatchesERKNSt6vectorINSt10shared_ptrI11RecordBatchEEEE
[3]:
https://arrow.apache.org/docs/cpp/api/table.html#_CPPv4N5arrow17ConcatenateTablesERKNSt6vectorINSt10shared_ptrI5TableEEEE24ConcatenateTablesOptionsP10MemoryPool

Aldrin Montana
Computer Science PhD Student
UC Santa Cruz


On Wed, Jun 29, 2022 at 9:53 AM L Ait <lhoussain.aitas...@gmail.com> wrote:

> Hi,
>
> I would like to be added to the mailing list, and would like to know if there
> is a dedicated forum for asking questions.
>
> I would like to integrate the Arrow C++ library into an internal project.
> One question I would like to resolve: is it possible to concatenate several
> chunks as a stream?
>
> The idea is that when using erasure coding, the objects are stored as chunks,
>
> and I would like to apply processing to these chunks as I receive them
> (without waiting to rebuild the original file).
>
> I hope that is clear.
>
> Thank you
>
> Lhoussain
>
