Hi,
As I'm still planning how to transport my data, I could use Thrift as well, but how do Thrift and Avro integrate? Avro seems very interesting for serializing data and is also pretty straightforward (define a schema -> transform the data -> read the data back using the same schema). What's not clear to me is the role of Thrift (OK, it will generate the code to make the RPC call) and how it integrates with Avro. Can you please add more details?
Regards,
Seba
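P.S. For reference, the only Thrift interface I've found so far is flume.thrift in the flume-ng-sdk sources; if I'm reading it right, the service looks roughly like this (copied by hand, so please double-check against the file shipped with your Flume version):

    // Copied by hand from flume-ng-sdk's flume.thrift; verify against your version.
    struct ThriftFlumeEvent {
      1: required map<string, string> headers,
      2: required binary body
    }

    enum Status { OK, FAILED, ERROR, UNKNOWN }

    service ThriftSourceProtocol {
      Status append(1: ThriftFlumeEvent event),
      Status appendBatch(1: list<ThriftFlumeEvent> events)
    }

If that's the contract, I guess Thrift only generates the client stub and carries the event over the wire, and my Avro-serialized record would simply go into the opaque binary body. Is that the intended integration?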
On Mon, Aug 25, 2014 at 11:26 PM, Hari Shreedharan <hshreedha...@cloudera.com> wrote:
> Can you use Thrift? That is the recommended mechanism for non-Java
> applications to talk to Flume.
>
> Sebastiano Di Paola wrote:
>
> Hi there,
> I'm trying to write a small piece of code using the Avro C API and then
> use a Flume Avro sink to write the collected and serialized data to
> Hadoop HDFS.
>
> I would like to use the Avro C API because all the other libs/code I'm
> using to generate/collect my data are already written in C, so I'm
> adding Avro output to existing software.
>
> From what I'm reading, the Avro C API does not support RPC calls yet.
>
> So is there a way to seamlessly integrate a Flume Avro source with the
> Avro C API? Is there any already written example?
>
> Or do I have to create my Avro file with the Avro C API (i.e.
> my_data_file.avro) on the file system and then use the already provided
> Avro client,
> bin/flume-ng avro-client -H localhost -p 41414 -F my_data_file.avro
> in order to have the file "read" through the Flume Avro source and sent
> to the Flume sink on HDFS?
>
> Isn't it possible to just configure the Flume Avro source to listen on a
> particular port and then use the Avro C API to send the message?
>
> (I did a quick search on the Avro C API, but I couldn't find any function
> I could call to answer my previous question.)
>
> Kind regards,
> Seba
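For completeness, this is roughly what I have in mind on the Avro C side for producing my_data_file.avro before handing it to bin/flume-ng avro-client. The record and field names below are just placeholders (not my real schema), most error handling is trimmed, and it assumes a reasonably recent Avro C (I'm on 1.7.x):

    /* Rough sketch: write a record into an Avro container file that
     * bin/flume-ng avro-client can then ship to the Flume Avro source.
     * Record/field names are placeholders, not my real schema. */
    #include <avro.h>
    #include <stdio.h>

    static const char SCHEMA_JSON[] =
        "{\"type\":\"record\",\"name\":\"Sample\",\"fields\":["
        "{\"name\":\"host\",\"type\":\"string\"},"
        "{\"name\":\"value\",\"type\":\"long\"}]}";

    int main(void)
    {
        avro_schema_t schema;
        if (avro_schema_from_json_literal(SCHEMA_JSON, &schema)) {
            fprintf(stderr, "schema error: %s\n", avro_strerror());
            return 1;
        }

        /* Container file that avro-client reads with -F. */
        avro_file_writer_t file;
        if (avro_file_writer_create("my_data_file.avro", schema, &file)) {
            fprintf(stderr, "file error: %s\n", avro_strerror());
            return 1;
        }

        avro_value_iface_t *iface = avro_generic_class_from_schema(schema);
        avro_value_t record, field;
        avro_generic_value_new(iface, &record);

        /* Fill one record and append it; the real code would loop
         * over the collected data here. */
        avro_value_get_by_name(&record, "host", &field, NULL);
        avro_value_set_string(&field, "collector-01");
        avro_value_get_by_name(&record, "value", &field, NULL);
        avro_value_set_long(&field, 42);
        avro_file_writer_append_value(file, &record);

        avro_file_writer_close(file);
        avro_value_decref(&record);
        avro_value_iface_decref(iface);
        avro_schema_decref(schema);
        return 0;
    }

I build it with something like gcc writer.c $(pkg-config --cflags --libs avro-c) and then run the already mentioned
bin/flume-ng avro-client -H localhost -p 41414 -F my_data_file.avro
to push the file to the Avro source.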