The easiest way to control logging in the Spark shell is to run
Logger.setLevel commands at the beginning of your program, e.g.:

org.apache.log4j.Logger.getLogger("com.amazon").setLevel(org.apache.log4j.Level.WARN)
org.apache.log4j.Logger.getLogger("com.amazonaws").setLevel(org.apache.log4j.Level.WARN)
org.apache.log4j.Logger.getLogger("amazon.emr").setLevel(org.apache.log4j.Level.WARN)
org.apache.log4j.Logger.getLogger("akka").setLevel(org.apache.log4j.Level.WARN)
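
The same suppression can also be done declaratively in conf/log4j.properties
instead of in code (a sketch; the logger names mirror the setLevel calls
above, and the console appender name is illustrative):

```properties
# Root logger stays at INFO on a console appender
log4j.rootLogger=INFO, console

# Quiet the noisy namespaces; these mirror the setLevel calls above
log4j.logger.com.amazon=WARN
log4j.logger.com.amazonaws=WARN
log4j.logger.amazon.emr=WARN
log4j.logger.akka=WARN
```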

On Tue, Oct 6, 2015 at 10:50 AM, Alex Kozlov <ale...@gmail.com> wrote:

> Try
>
> JAVA_OPTS='-Dlog4j.configuration=file:/<path-to-log4j.properties>'
>
> Internally, this is just spark.driver.extraJavaOptions, which you should
> be able to set in conf/spark-defaults.conf
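>
> For example, a conf/spark-defaults.conf entry might look like this (a
> sketch; keep the path placeholder pointing at your own file):
>
> ```properties
> spark.driver.extraJavaOptions  -Dlog4j.configuration=file:/<path-to-log4j.properties>
> ```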
>
> Can you provide more details how you invoke the driver?
>
> On Tue, Oct 6, 2015 at 9:48 AM, Jeff Jones <jjo...@adaptivebiotech.com>
> wrote:
>
>> Thanks. Any chance you know how to pass this to a Scala app that is run
>> via TypeSafe activator?
>>
>> I tried putting it $JAVA_OPTS but I get:
>>
>> Unrecognized option: --driver-java-options
>>
>> Error: Could not create the Java Virtual Machine.
>>
>> Error: A fatal exception has occurred. Program will exit.
>>
>>
>> I tried a bunch of different quoting but nothing produced a good result.
>> I also tried passing it directly to activator using --jvm but it still
>> produces the same results with verbose logging. Is there a way I can tell
>> if it’s picking up my file?
>>
>>
>>
>> From: Alex Kozlov
>> Date: Monday, October 5, 2015 at 8:34 PM
>> To: Jeff Jones
>> Cc: "user@spark.apache.org"
>> Subject: Re: How can I disable logging when running local[*]?
>>
>> Did you try “--driver-java-options
>> '-Dlog4j.configuration=file:/<path-to-log4j.properties>'” and setting the
>> log4j.rootLogger=FATAL,console?
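>>
>> A minimal log4j.properties along those lines might be (a sketch; the
>> appender classes are the standard log4j 1.x ones):
>>
>> ```properties
>> log4j.rootLogger=FATAL, console
>> log4j.appender.console=org.apache.log4j.ConsoleAppender
>> log4j.appender.console.layout=org.apache.log4j.PatternLayout
>> log4j.appender.console.layout.ConversionPattern=%d{HH:mm:ss} %p %c: %m%n
>> ```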
>>
>> On Mon, Oct 5, 2015 at 8:19 PM, Jeff Jones <jjo...@adaptivebiotech.com>
>> wrote:
>>
>>> I’ve written an application that hosts the Spark driver in-process using
>>> “local[*]”. I’ve turned off logging in my conf/log4j.properties file. I’ve
>>> also tried putting the following code prior to creating my SparkContext.
>>> These were cobbled together from various posts I've read. None of these
>>> steps
>>> have worked. I’m still getting a ton of logging to the console. Anything
>>> else I can try?
>>>
>>> Thanks,
>>> Jeff
>>>
>>> private def disableLogging(): Unit = {
>>>   import org.apache.log4j.{Level, Logger, PropertyConfigurator}
>>>
>>>   // Reload the properties file, then silence the root logger and the
>>>   // noisiest third-party namespaces.
>>>   PropertyConfigurator.configure("conf/log4j.properties")
>>>   Logger.getRootLogger().setLevel(Level.OFF)
>>>   Logger.getLogger("org").setLevel(Level.OFF)
>>>   Logger.getLogger("akka").setLevel(Level.OFF)
>>> }
>>>
>>>
>>>
>>> This message (and any attachments) is intended only for the designated
>>> recipient(s). It
>>> may contain confidential or proprietary information, or have other
>>> limitations on use as
>>> indicated by the sender. If you are not a designated recipient, you may
>>> not review, use,
>>> copy or distribute this message. If you received this in error, please
>>> notify the sender by
>>> reply e-mail and delete this message.
>>>
>>
>>
>>
>> --
>> Alex Kozlov
>> (408) 507-4987
>> (408) 830-9982 fax
>> (650) 887-2135 efax
>> ale...@gmail.com
>>
>>
>>
>
>
