You can control which log messages are shown by calling
"context.setLogLevel(<level>)" with one of the following levels:
ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN.
See 
http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.SparkContext@setLogLevel(logLevel:String):Unit
for further details.
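
For example, a minimal sketch in Java, assuming you create the
context as a JavaSparkContext in your main() (the app name and
master URL here are just illustrative):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class MyDriver {
      public static void main(String[] args) {
        SparkConf conf = new SparkConf()
            .setAppName("my-app")      // illustrative app name
            .setMaster("local[*]");    // illustrative master; adjust to your setup
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Suppress the INFO/WARN chatter; only ERROR and FATAL
        // messages from Spark will be logged from this point on.
        sc.setLogLevel("ERROR");

        // ... rest of the driver program ...

        sc.stop();
      }
    }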

Just one nitpick: when you say "I am also using spark standalone
mode and I don't submit jobs through command line. I just invoke
public static void main() of my driver program.", are you
referring to spark local mode? It is also possible to run spark
applications in "distributed mode" (i.e. standalone, yarn or
mesos) without going through spark-submit on the command line;
however, that requires using spark's launcher interface and
bundling your application in a jar, as sketched below.
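
For reference, a rough sketch of the launcher approach (the jar
path, main class and master URL below are placeholders):

    import org.apache.spark.launcher.SparkLauncher;

    public class LaunchMyApp {
      public static void main(String[] args) throws Exception {
        // Spawns a child process roughly equivalent to a spark-submit invocation
        Process spark = new SparkLauncher()
            .setAppResource("/path/to/my-app.jar")     // placeholder: your bundled application jar
            .setMainClass("com.example.MyDriver")      // placeholder: your driver class
            .setMaster("spark://master-host:7077")     // placeholder: standalone master URL
            .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
            .launch();
        spark.waitFor();
      }
    }

There is also SparkLauncher#startApplication, which returns a
handle you can use to monitor the submitted application instead of
dealing with the raw Process.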

On Thu, Oct 6, 2016 at 9:27 AM, kant kodali <kanth...@gmail.com> wrote:
> How do I disable or do minimal logging for an Apache Spark client driver program?
> I couldn't find this information in the docs. By driver program I mean the Java
> program where I initialize the spark context. It produces a lot of INFO messages,
> but I would only like to know when there is an error or an exception such as a
> NullPointerException and so on. I am also using spark standalone mode and I
> don't submit jobs through command line. I just invoke public static void
> main() of my driver program.
