/MAVEN/PluginExecutionException>
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn -rf :spark-core_2.10
Regards
Raghuveer
On Fri, Oct 30, 2015 at 1:18 PM, Raghuveer Chanda <
raghuveer.cha...@gmail.com> wrote:
> Thanks for t
build/mvn -DskipTests clean package
>
> Can you check if build/mvn was started successfully, or if it's using your
> own mvn? Let us know your JDK version as well.
>
> On Thu, Oct 29, 2015 at 11:34 PM, Raghuveer Chanda <
> raghuveer.cha...@gmail.com> wrote:
>
>> Hi,
>
.
CompileFailed -> [Help 1]
--
Regards,
Raghuveer Chanda
t as provided in sbt.
>
> -adrian
>
> From: Raghuveer Chanda
> Date: Wednesday, October 21, 2015 at 2:14 PM
> To: Jean-Baptiste Onofré
> Cc: "user@spark.apache.org"
> Subject: Re: Spark on Yarn
>
> Hi,
>
> So does this mean I can't run a Spark 1.4 fat jar
er side) and client versions diverge on Spark
> network JavaUtils. You should use the same/aligned version.
>
> Regards
> JB
>
>
>
> Sent from my Samsung device
>
>
> ---- Original message
> From: Raghuveer Chanda
> Date: 21/10/2015 12:33 (GMT+01:00)
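A minimal build.sbt sketch of the two suggestions in this thread (mark the Spark
dependency as provided so the fat jar does not bundle its own copy, and keep the
client version aligned with the one deployed on the cluster). The project name,
Scala version and Spark version below are assumptions, not the poster's actual build:

    name := "spark-yarn-app"

    scalaVersion := "2.10.5"          // assumed; should match the Spark build on the cluster

    libraryDependencies ++= Seq(
      // "provided": compile against Spark, but let the cluster supply it at runtime,
      // so the jar shipped to YARN cannot diverge from the server-side version.
      "org.apache.spark" %% "spark-core" % "1.4.1" % "provided"
    )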
Utils: Shutdown hook called
Please help :)
--
Regards and Thanks,
Raghuveer Chanda
Hi,
Has anyone tried Mosek <http://www.mosek.com/> Solver in Spark?
I am getting weird serialization errors. I came to know that Mosek uses shared
libraries, which may not be serializable.
Is this the reason they cannot be serialized, or is it working for anyone?
--
Regards,
Raghuveer Chan
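One common way around the Mosek question above, sketched below: a JNI-backed
solver generally cannot be serialized from the driver, so construct it inside
mapPartitions and let each executor build its own instance. The NativeSolver
class here is a stand-in, not the real Mosek API:

    import org.apache.spark.{SparkConf, SparkContext}

    // Stand-in for a JNI-backed solver that wraps a shared library and is not
    // Serializable; not the real Mosek API.
    class NativeSolver {
      def solve(problem: Int): Double = problem.toDouble
    }

    object SolverPerPartition {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("solver-per-partition"))
        val problems = sc.parallelize(1 to 100)

        // The solver is created inside the closure, on the executor, so Spark never
        // has to serialize it; only the small problem values travel over the wire.
        val results = problems.mapPartitions { iter =>
          val solver = new NativeSolver()
          iter.map(p => solver.solve(p))
        }

        results.collect().foreach(println)
        sc.stop()
      }
    }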
he Hadoop user
> list. The Hadoop web interfaces for both the NameNode and ResourceManager
> are enabled by default. Is it possible you have a firewall blocking those
> ports?
>
> -Sandy
>
> On Wed, Sep 24, 2014 at 9:00 PM, Raghuveer Chanda <
> raghuveer.cha...@gmail.com
fault option for
security the Web Interface is disabled.
How can I enable the web interface, i.e. is there any option in Cloudera, or
is the server firewall blocking it? Please help.
--
Regards,
Raghuveer Chanda
4th year Undergraduate Student
Computer Science and Engineering
IIT Kharagpur
can't look at the logs through the UI after the app
> stops.
>
> On Wed, Sep 24, 2014 at 11:16 AM, Raghuveer Chanda
> wrote:
> >
> > Thanks for the reply. This is the error in the logs obtained from the UI at
> >
> http://dml3:8042/node/containerlogs/container_1411578463780_0
ing it) to look at it.
>
> My guess is that your cluster doesn't have enough resources available
> to service the container request you're making. That will show up in
> the driver as periodic messages that no containers have been allocated
> yet.
>
> On Wed, Sep 24, 2014
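If that is the case, shrinking the requested containers is one common fix. A rough
sketch of the knobs involved; the values are placeholders, not recommendations, and
must fit under the YARN maximum container size on the cluster:

    import org.apache.spark.{SparkConf, SparkContext}

    object YarnSizingSketch {
      def main(args: Array[String]): Unit = {
        // Keep each requested container small enough for YARN to actually grant it.
        val conf = new SparkConf()
          .setAppName("yarn-sizing-sketch")
          .set("spark.executor.memory", "1g")      // stay below yarn.scheduler.maximum-allocation-mb
          .set("spark.executor.cores", "1")
          .set("spark.executor.instances", "2")    // YARN-only: how many executors to request

        val sc = new SparkContext(conf)
        println(s"defaultParallelism = ${sc.defaultParallelism}")
        sc.stop()
      }
    }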
> On Sep 24, 2014, at 11:25 AM, Raghuveer Chanda
> wrote:
>
> Hi,
>
> I'm new to Spark and facing a problem with running a job in a cluster using
> YARN.
>
> Initially I ran jobs using the Spark master as --master spark://dml2:7077 and
> it is running fine on 3 workers
our environment to point Spark to
> where your yarn configs are?
>
> Greg
>
> From: Raghuveer Chanda
> Date: Wednesday, September 24, 2014 12:25 PM
> To: "u...@spark.incubator.apache.org"
> Subject: Spark with YARN
>
> Hi,
>
> I'm new to spark
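A small sketch of Greg's point about the environment, assuming Spark 1.x client
mode; the master value and the environment-variable check are the only moving
parts here, and the application name is made up:

    import org.apache.spark.{SparkConf, SparkContext}

    object YarnEnvCheck {
      def main(args: Array[String]): Unit = {
        // Spark locates the ResourceManager through the Hadoop config directory, so the
        // launching environment has to point at it before the driver starts.
        val confDir = sys.env.get("HADOOP_CONF_DIR").orElse(sys.env.get("YARN_CONF_DIR"))
        require(confDir.isDefined,
          "Set HADOOP_CONF_DIR or YARN_CONF_DIR to the directory containing yarn-site.xml")

        // "yarn-client" is the Spark 1.x master value for client-mode submission to YARN.
        val conf = new SparkConf().setAppName("yarn-env-check").setMaster("yarn-client")
        val sc = new SparkContext(conf)
        println(s"Running on YARN, default parallelism = ${sc.defaultParallelism}")
        sc.stop()
      }
    }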