I first tried simply copying the code below into a Spark application that
already has working Kafka streaming code. No results were printed to the
system console. I then tried the same code in spark-shell, and there it
works. I can't figure out the reason.
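
For context, here is a minimal sketch of how the snippet is wired into the
streaming application. This is only an illustration: it assumes the SQL runs
inside foreachRDD on the Kafka DStream, and the app name, ZooKeeper address,
group id, and topic below are placeholders, not the real values.

import org.apache.spark.SparkConf
import org.apache.spark.sql.SQLContext
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

val conf = new SparkConf().setAppName("KafkaSqlPrint")  // placeholder app name
val ssc = new StreamingContext(conf, Seconds(10))
val sqlContext = new SQLContext(ssc.sparkContext)

// Placeholder Kafka parameters (ZooKeeper quorum, group id, topic -> partitions).
val stream = KafkaUtils.createStream(ssc, "zkhost:2181", "mygroup", Map("mytopic" -> 1))

stream.foreachRDD { rdd =>
  // jsonRDD infers a schema from the data, so skip empty batches.
  if (rdd.count() > 0) {
    val json = sqlContext.jsonRDD(rdd.map(_._2))  // take the message values
    json.registerTempTable("people")
    // collect() brings the rows to the driver; println then writes to the
    // driver's console, not to the executors' stdout.
    sqlContext.sql("select count(*) from people").collect().foreach(println)
  }
}

ssc.start()
ssc.awaitTermination()

In spark-shell the SparkContext (sc) is already created for you, which may be
part of why the same snippet behaves differently there.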

Best regards,

Cui Lin

From: Tobias Pfeiffer <t...@preferred.jp>
Date: Tuesday, March 3, 2015 at 5:13 PM
To: Cui Lin <cui....@hds.com>
Cc: "user@spark.apache.org" <user@spark.apache.org>
Subject: Re: Spark sql results can't be printed out to system console from
spark streaming application

Hi,

can you explain how you copied that into your *streaming* application? For
example, how do you issue the SQL, what data do you operate on, how do you
view the logs, etc.?

Tobias

On Wed, Mar 4, 2015 at 8:55 AM, Cui Lin <cui....@hds.com> wrote:


>Dear all,
>
>I found that the sample code below prints its output only in spark-shell;
>when I move it into my Spark Streaming application, nothing is printed to
>the system console. Can you explain why this happens? Is it related to
>creating a new Spark context? Thanks a lot!
>
>
>val anotherPeopleRDD = sc_context.parallelize(
>  """{"name":"Yin","address":{"city":"Columbus","state":"Ohio"}}""" :: Nil)
>
>anotherPeopleRDD.collect().foreach(println)
>
>val jsonMessage = sqlContext.jsonRDD(anotherPeopleRDD)
>
>jsonMessage.collect().foreach(println)
>
>jsonMessage.registerTempTable("people")
>
>val test: SchemaRDD = sqlContext.sql("select count(*) from people")
>
>test.collect().foreach(println)
>
>Best regards,
>
>Cui Lin

