got it! Thanks!

On Fri, Oct 7, 2016 12:41 PM, Jakob Odersky ja...@odersky.com wrote:
Hi Kant,

job submission through the command line is not strictly required, although it is the most common way (it's flexible and easy to use) of running applications that depend on Spark. The shell script "spark-submit" ends up doing similar things to what your code snippet shows.
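
For reference, a typical spark-submit invocation looks something like this; the class name, master URL and jar path below are placeholders, not taken from your setup:

spark-submit \
  --class com.example.SparkDriver \
  --master spark://master-host:7077 \
  /path/to/your-app.jar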




I asked if you meant "local" mode when you wrote "I just invoke public static void main() of my driver program" because I have seen people confuse "local" and "standalone" in the past.
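
The difference shows up in the master URL you hand to SparkConf. A minimal sketch, with a placeholder host and port:

import org.apache.spark.SparkConf;

// "local" mode: driver and executors run inside a single JVM; no cluster needed.
SparkConf localConf = new SparkConf()
        .setAppName("example")
        .setMaster("local[*]");

// "standalone" mode: connects to a separately started Spark standalone master.
// "master-host:7077" is a placeholder for your master's address.
SparkConf standaloneConf = new SparkConf()
        .setAppName("example")
        .setMaster("spark://master-host:7077");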




--Jakob


On Thu, Oct 6, 2016 at 10:30 PM, kant kodali <kanth...@gmail.com> wrote:

Hi Jakob,

It is the biggest question for me too, since I seem to be on a different page than everyone else whenever I say "I am also using spark standalone mode and I don't submit jobs through the command line. I just invoke public static void main() of my driver program".



Everyone keeps talking about submitting jobs from the command line; even with the words "submit job", people automatically assume it is happening from the command line. I just set up a standalone cluster and do this:

SparkConf sparkConf = config.buildSparkConfig();
sparkConf.setJars(JavaSparkContext.jarOfClass(SparkDriver.class));
JavaStreamingContext ssc = new JavaStreamingContext(sparkConf,
        new Duration(config.getSparkStremingBatchInterval()));
ssc.sparkContext().setLogLevel("ERROR");
Receiver receiver = new Receiver(config);
JavaReceiverInputDStream<String> jsonMessagesDStream = ssc.receiverStream(receiver);
// count() returns a DStream; an output operation such as print() is required
// for Spark Streaming to actually execute anything when start() is called.
jsonMessagesDStream.count().print();
ssc.start();
ssc.awaitTermination();

so I assume submitting a job happens through this API. Please correct me if I am wrong.



Thanks


On Thu, Oct 6, 2016 1:38 PM, Jakob Odersky ja...@odersky.com wrote:



You can change the kind of log messages that are shown by calling "context.setLogLevel(<level>)" with an appropriate level: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN.

See http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.SparkContext@setLogLevel(logLevel:String):Unit for further details.
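
If you also want to silence the INFO messages that appear before the context is created, you can set the root log level directly through the log4j API that Spark bundles. A minimal sketch:

import org.apache.log4j.Level;
import org.apache.log4j.Logger;

// Raise the root logger threshold before creating the SparkContext, so
// startup INFO messages are suppressed as well.
Logger.getRootLogger().setLevel(Level.ERROR);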





Just one nitpick: when you say "I am also using spark standalone mode and I don't submit jobs through command line. I just invoke public static void main() of my driver program." are you referring to spark local mode? It is possible to also run spark applications in "distributed mode" (i.e. standalone, yarn or mesos) just from the command line, however that will require using spark's launcher interface and bundling your application in a jar.
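
A minimal sketch of that launcher interface (org.apache.spark.launcher.SparkLauncher, available since Spark 1.4); the jar path, main class and master URL are placeholders:

import org.apache.spark.launcher.SparkLauncher;

public class LaunchExample {
    public static void main(String[] args) throws Exception {
        // All paths, class names and URLs here are placeholders.
        Process spark = new SparkLauncher()
                .setAppResource("/path/to/your-app.jar")
                .setMainClass("com.example.SparkDriver")
                .setMaster("spark://master-host:7077")
                .launch();
        // Wait for the launched application to finish.
        spark.waitFor();
    }
}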


On Thu, Oct 6, 2016 at 9:27 AM, kant kodali <kanth...@gmail.com> wrote:



> How to disable or do minimal logging for the Apache Spark client driver
> program? I couldn't find this information in the docs. By driver program I
> mean the Java program where I initialize the Spark context. It produces a
> lot of INFO messages, but I would like to see output only when there is an
> error or an exception, such as a NullPointerException. I am also using
> Spark standalone mode and I don't submit jobs through the command line; I
> just invoke public static void main() of my driver program.
