Hi

At this moment we have the same requirement. Unfortunately, the database
owners will not be able to push to a message queue, but they have enabled
Oracle CDC, which synchronously updates a replica of the production DB. Our
task will be to query the replica and create message streams to Kinesis.
There is already an event processor listening to Kinesis.

I am toying with two ideas: a) build a custom receiver, or b) run a simple
Spark job every 2 minutes.
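For option b), the core of what I have in mind is an incremental poll keyed
on a watermark column. A minimal sketch (table name `events` and column
`updated_at` are placeholders, sqlite3 stands in for the Oracle replica, and
the Kinesis put is stubbed out):

```python
import sqlite3

def fetch_changes(conn, last_seen):
    # Pull only rows updated since the previous poll (watermark pattern).
    cur = conn.execute(
        "SELECT id, payload, updated_at FROM events "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last_seen,),
    )
    return cur.fetchall()

def send_to_kinesis(record):
    # Stub: the real job would do a boto3 put_record, or write from
    # foreachPartition in the Spark job.
    print("would send:", record)

def poll_once(conn, last_seen):
    rows = fetch_changes(conn, last_seen)
    for row in rows:
        send_to_kinesis(row)
    # Advance the watermark to the latest timestamp seen, so the next
    # poll picks up only newer rows.
    return rows[-1][2] if rows else last_seen
```

The scheduler would then just call poll_once every 2 minutes (cron, or a
sleep loop in a long-running driver), persisting the watermark between runs.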

Any suggestion?

Best
Ayan

On Tue, Jul 14, 2015 at 5:47 PM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> Why not add a trigger to your database table and, whenever it's updated,
> push the changes to Kafka etc. and use normal Spark Streaming? You could
> also write a receiver-based architecture
> <https://spark.apache.org/docs/latest/streaming-custom-receivers.html>
> for this, but that will be a bit time consuming. Another approach would be
> to use a normal Spark job which is triggered whenever there's a change
> in your DB tables.
>
> Thanks
> Best Regards
>
> On Mon, Jul 13, 2015 at 9:43 PM, unk1102 <umesh.ka...@gmail.com> wrote:
>
>> Hi, I did Kafka streaming through Spark Streaming. I have a use case where
>> I would like to stream data from a database table. I see JdbcRDD is there,
>> but that is not what I am looking for. I need continuous streaming, like
>> JavaSparkStreaming, which continuously runs and listens to changes in a
>> database table and gives me the changes to process and store in HDFS.
>> Please guide me; I am new to Spark. Thanks in advance.
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/Does-Spark-Streaming-support-streaming-from-a-database-table-tp23801.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>>
>


-- 
Best Regards,
Ayan Guha
