Deep, yes, you likely have another Spark shell or application sticking around
somewhere. Try inspecting the running processes, look out for a stray Java
process, and kill it.
This might be helpful
https://www.digitalocean.com/community/tutorials/how-to-use-ps-kill-and-nice-to-manage-processes-in-linux
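If you actually need a second application up at the same time, another option
is to give it its own UI port instead of the default 4040, which is the port
the BindException below is complaining about. A minimal sketch (the app name
and port number are placeholders, not anything from this thread):

    import org.apache.spark.{SparkConf, SparkContext}

    // Assumption: we want a second app alongside a running spark-shell,
    // so we move its web UI off the default port 4040.
    val conf = new SparkConf()
      .setAppName("SecondApp")          // placeholder name
      .set("spark.ui.port", "4050")     // any free port works
    val sc = new SparkContext(conf)

That said, when the port is taken Spark normally just logs the WARN and
retries the next port, so the warning on its own is usually harmless.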
Yes, I have increased the driver memory in spark-defaults.conf to 2g, but the
error still persists.
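For reference, that setting is a single line in conf/spark-defaults.conf like
the one below (the value is just the one mentioned above). Note that
spark.driver.memory only takes effect if it is set before the driver JVM
starts, i.e. in this file or via the --driver-memory flag, not from
application code:

    # conf/spark-defaults.conf
    spark.driver.memory   2g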
On Tue, Jan 20, 2015 at 10:18 AM, Ted Yu wrote:
> Have you seen these threads?
>
> http://search-hadoop.com/m/JW1q5tMFlb
> http://search-hadoop.com/m/JW1q5dabji1
>
> Cheers
>
> On Mon, Jan 19, 2015 at
Have you seen these threads?
http://search-hadoop.com/m/JW1q5tMFlb
http://search-hadoop.com/m/JW1q5dabji1
Cheers
On Mon, Jan 19, 2015 at 8:33 PM, Deep Pradhan
wrote:
> Hi Ted,
> When I run the same job with small data, it runs fine. But when
> I run it with a relatively bigger set of
I closed the Spark shell and tried again, but there was no change.
Here is the error:
...
15/01/17 14:33:39 INFO AbstractConnector: Started
SocketConnector@0.0.0.0:59791
15/01/17 14:33:39 INFO Server: jetty-8.y.z-SNAPSHOT
15/01/17 14:33:39 WARN AbstractLifeCycle: FAILED
SelectChannelConnector@0.0.0.0:40
I had the Spark shell running throughout. Is it because of that?
On Tue, Jan 20, 2015 at 9:47 AM, Ted Yu wrote:
> Was there another instance of Spark running on the same machine?
>
> Can you pastebin the full stack trace?
>
> Cheers
>
> On Mon, Jan 19, 2015 at 8:11 PM, Deep Pradhan
> wrote:
Was there another instance of Spark running on the same machine?
Can you pastebin the full stack trace?
Cheers
On Mon, Jan 19, 2015 at 8:11 PM, Deep Pradhan
wrote:
> Hi,
> I am running a Spark job. I get the output correctly but when I see the
> logs file I see the following:
> AbstractLifeC
Hi,
I am running a Spark job. I get the output correctly, but when I look at the
log file I see the following:
AbstractLifeCycle: FAILED: java.net.BindException: Address already in
use...
What could be the reason for this?
Thank you
Hello Neha,
This is the result of a known bug in 0.9. Can you try running the latest
Spark master branch to see if this problem is resolved?
TD
On Tue, Apr 22, 2014 at 2:48 AM, NehaS Singh wrote:
> Hi,
>
> I have installed
> spark-0.9.0-incubating-bin-cdh4 and
Hi,
I have installed spark-0.9.0-incubating-bin-cdh4 and I am using Apache Flume
for streaming. I have used the streaming.examples.FlumeEventCount example, and
I have written an Avro conf file for Flume. When I try to do streaming in
Spark and run the following command, it
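For anyone following along, the core of the FlumeEventCount example boils down
to roughly the following (a sketch against the 0.9-era streaming API; the host
and port are placeholders that must match the Avro sink in your Flume conf
file):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.flume.FlumeUtils

    // Assumption: a Flume agent with an Avro sink pointing at this host/port.
    val conf = new SparkConf().setAppName("FlumeEventCount")
    val ssc = new StreamingContext(conf, Seconds(2))

    // Receive events from the Flume Avro sink and count them per batch.
    val stream = FlumeUtils.createStream(ssc, "localhost", 41414)
    stream.count().map(cnt => "Received " + cnt + " flume events.").print()

    ssc.start()
    ssc.awaitTermination()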