I would use Sqoop. It has been designed exactly for these types of
scenarios. Spark Streaming does not make sense here.
On Sun, 5 Jul 2015 at 1:59, ayan guha wrote:
> Hi All
>
> I have a requirement to connect to a DB every few minutes and bring data to
> HBase. Can anyone suggest if Spark Streaming is a good fit?
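For reference, a periodic DB-to-HBase load of the kind suggested above is what Sqoop's incremental import mode does. A sketch of such an invocation (the connection string, credentials, table, and column names below are all hypothetical placeholders; only the flags are real Sqoop 1 options):

```shell
# Pull rows changed since the last run straight into an HBase table.
# Re-run from cron every few minutes, advancing --last-value each time.
sqoop import \
  --connect jdbc:mysql://dbhost/mydb \
  --username etl \
  --password-file /user/etl/.pw \
  --table orders \
  --hbase-table orders \
  --column-family cf \
  --hbase-row-key order_id \
  --incremental lastmodified \
  --check-column updated_at \
  --last-value "2015-07-05 00:00:00"
```

A `sqoop job --create` saved job can track `--last-value` automatically between runs, which fits the "every few minutes" requirement without any long-running process.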
The things that go to store() make up the rdds in your DStream. So,
intervals, windowing, etc. apply to those. The receiver is the boundary
between your data source and the DStream RDDs. In other words, if your
interval is 15 seconds with no windowing, then the things that went to
store() every 15 seconds are bunched up into an RDD of your DStream. That's
kind of a simplification, but should give you the idea that your "db
polling" interval and streaming interval are not tied together.

-Ashic.
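The bunching described above can be illustrated with a small, Spark-free Scala sketch (the object name, helper, and timestamps are purely illustrative, not Spark API):

```scala
// Toy, Spark-free illustration: things "store()d" at a fast polling
// interval get bunched into one batch per streaming interval.
object BatchingDemo {
  // Group (timestampMs, value) events into buckets of batchMs width,
  // keyed by the bucket's start time -- a stand-in for DStream RDDs.
  def bunch[A](events: Seq[(Long, A)], batchMs: Long): Map[Long, Seq[A]] =
    events.groupBy { case (t, _) => (t / batchMs) * batchMs }
          .map { case (start, evs) => start -> evs.map(_._2) }

  def main(args: Array[String]): Unit = {
    // Poll the "db" every 5s with a 15s batch interval: three polls
    // worth of stored rows end up in each batch's RDD.
    val polls = (0L until 30L by 5L).map(t => (t * 1000, s"rows@${t}s"))
    bunch(polls, 15000L).toSeq.sortBy(_._1).foreach { case (start, rows) =>
      println(s"RDD for batch starting ${start / 1000}s: $rows")
    }
  }
}
```

The point it demonstrates is exactly the one above: the polling cadence decides how often items are stored, while the batch interval alone decides how those items are grouped into RDDs.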
Date: Mon, 6 Jul 2015 01:12:34 +1000
Subject: Re: JDBC Streams
From: guha.a...@gmail.com
To: as...@live.com
CC: ak...@sigmoidanalytics.com; user@spark.apache.org
Hi
Thanks for the reply. Here is my situation: I have a DB which enables synchron… cronjob is enough!
Date: Sun, 5 Jul 2015 22:48:37 +1000
Subject: Re: JDBC Streams
From: guha.a...@gmail.com
To: ak...@sigmoidanalytics.com
CC: user@spark.apache.org
Thanks Akhil. In case I go with spark streaming, I guess I have to implement
a custom receiver and spark streaming will call this receiver every batch
interval, is that correct? Any gotcha you see in this plan? TIA...Best, Ayan
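A minimal sketch of such a custom receiver, using the Spark 1.x Receiver API (the JDBC URL, query, and polling period are hypothetical placeholders). Note that, per the discussion in this thread, Spark does not call the receiver once per batch interval: onStart() is called once, the receiver polls on its own thread, and whatever it passes to store() is bunched into one RDD per batch interval:

```scala
import java.sql.DriverManager

import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// Sketch only: polls a DB with a fixed query and stores the first
// column of each row as a String. Real code would track a watermark
// column so each poll only fetches new rows.
class JdbcPollingReceiver(url: String, query: String, pollMs: Long)
    extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {

  def onStart(): Unit = {
    // Started once by Spark; polling runs on our own thread.
    new Thread("jdbc-poller") {
      override def run(): Unit = poll()
    }.start()
  }

  def onStop(): Unit = ()  // the polling loop checks isStopped()

  private def poll(): Unit =
    while (!isStopped()) {
      val conn = DriverManager.getConnection(url)
      try {
        val rs = conn.createStatement().executeQuery(query)
        while (rs.next()) store(rs.getString(1))  // bunched per batch interval
      } finally conn.close()
      Thread.sleep(pollMs)  // polling cadence, independent of batch interval
    }
}

// Usage (ssc is a StreamingContext):
//   val stream = ssc.receiverStream(new JdbcPollingReceiver(jdbcUrl, sql, 60000L))
```

This is a sketch under the assumptions above, not production code; it needs a running Spark Streaming application and a reachable database, and omits error handling (restart(), deduplication across polls).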
On Sun, Jul 5, 2015 at 5:40 PM, Akhil Das wrote:
If you want a long running application, then go with spark streaming (which
kind of blocks your resources). On the other hand, if you use job server
then you can actually use the resources (CPUs) for other jobs also when
your dbjob is not using them.
Thanks
Best Regards