Hi Ankur
If you are using standalone mode, your config is wrong. You should put "export
SPARK_DAEMON_MEMORY=xxx" in conf/spark-env.sh. At least that works on my
Spark 1.3.0 standalone-mode machine.
BTW, SPARK_DRIVER_MEMORY is used in YARN mode, and it looks like
standalone mode doesn't use this c
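For reference, a minimal conf/spark-env.sh entry for standalone mode might look like this (the 1g value is just an example; tune it for your cluster):

```shell
# conf/spark-env.sh -- sourced by the standalone master and worker daemons
# Heap size for the daemon processes themselves (example value)
export SPARK_DAEMON_MEMORY=1g
```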
Great work!
On May 30, 2014 10:15 PM, "Ian Ferreira" wrote:
> Congrats
>
> Sent from my Windows Phone
> --
> From: Dean Wampler
> Sent: 5/30/2014 6:53 AM
> To: user@spark.apache.org
> Subject: Re: Announcing Spark 1.0.0
>
> Congratulations!!
>
>
> On Fri, May 30,
Hi Sandeep
I think you should use testRatings.mapToPair instead of testRatings.map;
map returns a JavaRDD, while mapToPair is what gives you a JavaPairRDD.
So the code should be:

JavaPairRDD usersProducts = training.mapToPair(
    new PairFunction() {
        public Tuple2 call(Rating r) throws Exception {
            re
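To illustrate the transformation outside of Spark, here is a plain-Java sketch of the same idea. The Rating record below is a hypothetical stand-in for Spark's Rating class, and java.util streams stand in for the RDD (real JavaPairRDD code needs a running SparkContext):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class MapToPairSketch {
    // Hypothetical stand-in for org.apache.spark.mllib.recommendation.Rating
    record Rating(int user, int product, double rating) {}

    public static void main(String[] args) {
        List<Rating> testRatings =
            List.of(new Rating(1, 10, 4.0), new Rating(2, 20, 5.0));
        // The analogue of mapToPair: each Rating becomes a (user, product)
        // key-value pair instead of a single mapped value
        List<Map.Entry<Integer, Integer>> usersProducts = testRatings.stream()
            .map(r -> Map.entry(r.user(), r.product()))
            .collect(Collectors.toList());
        System.out.println(usersProducts); // [1=10, 2=20]
    }
}
```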
Hi Prasad
Sorry, I missed your reply.
https://gist.github.com/thegiive/10791823
Here it is.
Wisely Chen
On Fri, Apr 4, 2014 at 11:57 PM, Prasad wrote:
> Hi Wisely,
> Could you please post your pom.xml here.
>
> Thanks
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list
Hi Praveen
What is your config for "spark.local.dir"?
Does every worker have this dir, and does every worker have the right
permissions on it?
I think this is the reason for your error.
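A quick way to check this on each worker node, assuming spark.local.dir points at /tmp/spark-local (a hypothetical path; substitute your configured value):

```shell
# Hypothetical spark.local.dir value; replace with your configured path
DIR=/tmp/spark-local
# Create it if missing and make it writable by the user running the worker
mkdir -p "$DIR"
chmod 755 "$DIR"
# The directory must exist and be writable on EVERY worker
test -d "$DIR" && test -w "$DIR" && echo "ok: $DIR usable"
```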
Wisely Chen
On Mon, Apr 14, 2014 at 9:29 PM, Praveen R wrote:
> Had below error while running shark queries on
Hi Andy
We are from Taiwan. We are already planning to have a Spark meetup.
We already have some resources, like a venue and a food budget, but we do
need some other resources.
Please contact me offline.
Thanks
Wisely Chen
On Tue, Apr 1, 2014 at 1:28 AM, Andy Konwinski wrote:
> Hi folks,
>
> We hav
This response is for Sai
The easiest way to verify your current spark-shell setting is to just type
"sc.master".
If your setting is correct, it should return
scala> sc.master
res0: String = spark://master.ip.url.com:5050
If your SPARK_MASTER_IP setting is not correct, it will respond
scala> sc.mast
Hi
I am quite a beginner in Spark, and I had a similar issue last week. I don't
know if my issue is the same as yours. I found that my program's jar
contained protobuf; when I removed this dependency from my program's pom.xml
and rebuilt the program, it worked.
Here is how I solved my own issue.
Environ
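Removing (or excluding) the conflicting protobuf dependency might look like this in pom.xml. This is a hypothetical sketch; the groupId/artifactId of the parent dependency are placeholders, not from the original mail:

```xml
<!-- Exclude a conflicting protobuf pulled in transitively by a direct
     dependency (placeholder coordinates shown) -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>my-library</artifactId>
  <version>1.0</version>
  <exclusions>
    <exclusion>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```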