Hi,

I am testing Spark Streaming (local mode, with Kafka). The code is as
follows:

public class LocalStreamTest2 {
  public static void main(String[] args) {
    JavaSparkContext sc = new JavaSparkContext("local[4]", "Local Stream Test");
    // batch interval assumed; the original message is truncated here
    JavaStreamingContext ssc = new JavaStreamingContext(sc, new Duration(1000));
  }
}
> You could use saveAsTextFiles, for example, if you're saving to the file
> system as strings. You can also dump the DStream to a DB -- there are
> samples on this list (you'd have to do a combo of foreachRDD and
> mapPartitions, likely)
>
> On Wed, Oct 1, 2014 at 3:49 AM, Chia-Chun Shih
> wrote:
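For the DB route mentioned above, the per-partition write loop is the part worth sketching. The code below is plain Java (no Spark on the classpath): `writePartition` is the body you would run inside `foreachRDD` / `foreachPartition`, and the `Consumer<String>` sink is a hypothetical stand-in for your DB client -- the point is one connection per partition, reused for every record in it.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;
import java.util.function.Consumer;

public class PartitionWriter {
    // The body you'd pass to foreachPartition: one sink (DB connection)
    // per partition, reused for every record in that partition.
    public static int writePartition(Iterator<String> records, Consumer<String> sink) {
        int written = 0;
        while (records.hasNext()) {
            sink.accept(records.next()); // e.g. add to a batched INSERT
            written++;
        }
        return written; // the caller would commit/close the connection here
    }

    public static void main(String[] args) {
        List<String> sink = new ArrayList<>(); // in-memory stand-in for a DB table
        int n = writePartition(Arrays.asList("a", "b", "c").iterator(), sink::add);
        System.out.println(n + " rows written: " + sink);
    }
}
```

Opening the connection inside the partition body (rather than on the driver) matters because the function is serialized and shipped to executors; connections generally aren't serializable.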
> What you seek is what happens "out of the box" (unless I'm
> misunderstanding the question)
>
> On Wed, Oct 1, 2014 at 4:13 AM, Chia-Chun Shih
> wrote:
>
>> Hi,
>>
>> Are there any code examples demonstrating spark streaming applications
>> which depend on states? That is, last-run *updateStateByKey* results are
>> used as inputs.
>>
>> Thanks.
>>
>>
>
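To answer the question directly: the heart of a stateful streaming job is the update function you hand to updateStateByKey. Below is that function in plain Java, using java.util.Optional in place of Spark's own Optional type so it runs standalone; Spark invokes it once per key per batch, passing the batch's new values and the previous state -- so last-run results are exactly what arrives as the `state` argument.

```java
import java.util.List;
import java.util.Optional;

public class RunningCount {
    // The function passed to updateStateByKey: previous state in, new state out.
    // Returning Optional.empty() would drop the key from the state.
    public static Optional<Integer> updateCount(List<Integer> newValues,
                                                Optional<Integer> state) {
        int sum = state.orElse(0);
        for (int v : newValues) {
            sum += v;
        }
        return Optional.of(sum);
    }

    public static void main(String[] args) {
        // Batch 1: key sees values [1, 2], no prior state -> 3
        Optional<Integer> s1 = updateCount(List.of(1, 2), Optional.empty());
        // Batch 2: last run's result is the input state -> 3 + 4 = 7
        Optional<Integer> s2 = updateCount(List.of(4), s1);
        System.out.println(s2.get()); // 7
    }
}
```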
Hi,

My application digests user logs and deducts user quotas. I need to
maintain the latest user quota states persistently, so that they are
not lost.

I have tried *updateStateByKey* to generate a DStream for user quotas,
and called *persist(StorageLevel.MEMORY_AND_DISK())* on it.
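For the quota use case, here is a sketch of the update function, again in plain Java with java.util.Optional standing in for Spark's type and a made-up initial quota of 100. Each new value is usage to deduct from the remaining quota carried over from the last batch. Note that updateStateByKey also requires checkpointing to be enabled (ssc.checkpoint(dir)); persist() alone keeps the state RDDs cached for reuse but does not make the state durable across driver failures.

```java
import java.util.List;
import java.util.Optional;

public class QuotaState {
    // Update function for updateStateByKey: the previous batch's remaining
    // quota is the input state; each new value is usage to deduct.
    public static Optional<Long> updateQuota(List<Long> usages,
                                             Optional<Long> remaining) {
        long quota = remaining.orElse(100L); // initial quota is an assumption
        for (long used : usages) {
            quota -= used;
        }
        return Optional.of(Math.max(quota, 0L)); // floor at zero
    }

    public static void main(String[] args) {
        Optional<Long> afterBatch1 = updateQuota(List.of(10L, 20L), Optional.empty());
        Optional<Long> afterBatch2 = updateQuota(List.of(50L), afterBatch1);
        System.out.println(afterBatch2.get()); // 100 - 30 - 50 = 20
    }
}
```

If the quotas must also survive outside Spark (e.g. for other systems to read), the usual pattern is to combine this with the foreachRDD write suggested earlier in the thread, pushing each batch's updated state out to a DB.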