Great success!

I was able to get output to the driver console by changing the construction
of the Streaming Spark Context from:

 val ssc = new StreamingContext("local" /**TODO change once a cluster is up
**/,
        "AppName", Seconds(1))


to:

val ssc = new StreamingContext("local[2]" /**TODO change once a cluster is
up **/,
        "AppName", Seconds(1))


I was tipped off that this might work while digging through this mailing list:
with "local" there is only a single thread, and the socket receiver occupies it,
so no thread is left to process the received data. "local[2]" leaves at least one
thread free for processing alongside the receiver.
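For completeness, here is a minimal sketch of the working setup (the host, port,
and app name below are placeholders, not the actual values from my job; the real
input arrives over a socket as described in the quoted thread):

import org.apache.spark.streaming.{Seconds, StreamingContext}

// "local[2]": the socket receiver pins one thread, so at least one more
// is needed to actually process the batches.
val ssc = new StreamingContext("local[2]" /** TODO change once a cluster is up **/,
        "AppName", Seconds(1))

// Placeholder host/port -- substitute whatever the socket source actually uses.
val lines = ssc.socketTextStream("localhost", 9999)

// With a free worker thread, the counts now show up on the driver console.
lines.foreachRDD(rdd => println(rdd.count))

ssc.start()
ssc.awaitTermination()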


On Sun, Jul 13, 2014 at 2:00 PM, Walrus theCat <walrusthe...@gmail.com>
wrote:

> More strange behavior:
>
> lines.foreachRDD(x => println(x.first)) // works
> lines.foreachRDD(x => println((x.count,x.first))) // no output is printed
> to driver console
>
>
>
>
> On Sun, Jul 13, 2014 at 11:47 AM, Walrus theCat <walrusthe...@gmail.com>
> wrote:
>
>>
>> Thanks for your interest.
>>
>> lines.foreachRDD(x => println(x.count))
>>
>> And I got 0 every once in a while (which I think is strange, because
>> lines.print prints the input I'm giving it over the socket.)
>>
>>
>> When I tried:
>>
>> lines.map(_->1).reduceByKey(_+_).foreachRDD(x => println(x.count))
>>
>> I got no count.
>>
>> Thanks
>>
>>
>> On Sun, Jul 13, 2014 at 11:34 AM, Tathagata Das <
>> tathagata.das1...@gmail.com> wrote:
>>
>>> Try doing DStream.foreachRDD and then printing the RDD count and further
>>> inspecting the RDD.
>>>  On Jul 13, 2014 1:03 AM, "Walrus theCat" <walrusthe...@gmail.com>
>>> wrote:
>>>
>>>> Hi,
>>>>
>>>> I have a DStream that works just fine when I say:
>>>>
>>>> dstream.print
>>>>
>>>> If I say:
>>>>
>>>> dstream.map(_ -> 1).print
>>>>
>>>> that works, too.  However, if I do the following:
>>>>
>>>> dstream.reduce{case(x,y) => x}.print
>>>>
>>>> I don't get anything on my console.  What's going on?
>>>>
>>>> Thanks
>>>>
>>>
>>
>
