I am running on m3.xlarge instances on AWS with 12 GB of worker memory and 10
GB of executor memory.
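For reference, settings like the ones quoted above are typically made along these lines in a Spark 1.x standalone deployment (a sketch only; the exact files and the split between them are assumptions, not taken from this thread):

```shell
# conf/spark-env.sh -- memory available to the standalone worker daemon
SPARK_WORKER_MEMORY=12g

# conf/spark-defaults.conf (a separate file) -- memory granted to each executor
# spark.executor.memory   10g
```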
On Sun, Feb 1, 2015, 12:41 PM Arush Kharbanda
wrote:
> What is the machine configuration you are running it on?
>
> On Mon, Feb 2, 2015 at 1:46 AM, Ankur Srivastava <
> ankur.srivast...@gmail.com> wrote:
Can you share your log4j file?
On Sat, Jan 31, 2015 at 1:35 PM, Arush Kharbanda wrote:
Hi Ankur,
It's running fine for me on Spark 1.1 with changes to the log4j properties file.
Thanks
Arush
On Fri, Jan 30, 2015 at 9:49 PM, Ankur Srivastava <
ankur.srivast...@gmail.com> wrote:
Hi Arush,
I have configured log4j by updating the log4j.properties file in the
SPARK_HOME/conf folder.
If it were a log4j defect, we would get errors in debug mode in all apps.
Thanks
Ankur
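For context, the change Ankur describes usually amounts to raising the root log level in conf/log4j.properties. A minimal sketch, assuming the log4j 1.x console appender that Spark 1.x's log4j.properties.template ships with:

```properties
# conf/log4j.properties -- log everything to the console at DEBUG
log4j.rootCategory=DEBUG, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```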
Hi Ankur,
How are you enabling the debug level of logs? It should be a log4j
configuration. Even if there were an issue, it would be in log4j and not in
Spark.
Thanks
Arush
On Fri, Jan 30, 2015 at 4:24 AM, Ankur Srivastava <
ankur.srivast...@gmail.com> wrote:
Hi,
Whenever I enable DEBUG-level logs for my Spark cluster, running a job
causes all the executors to die with the exception below. When I disable
the DEBUG logs, my jobs move on to the next step.
I am on spark-1.1.0.
Is this a known issue with Spark?
Thanks
Ankur
2015-01-29 22:27:42,467 [main] INFO org
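One way to keep debugging while cutting the sheer log volume DEBUG produces cluster-wide — a sketch, not a confirmed fix for the crash above — is to leave the root logger at INFO and raise only the loggers of interest (log4j 1.x syntax; the chosen package is just an example):

```properties
# conf/log4j.properties -- keep the default level quiet
log4j.rootCategory=INFO, console
# Enable DEBUG only for Spark's own classes, not every library on the classpath
log4j.logger.org.apache.spark=DEBUG
```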