spark-shell has syntax error on windows.

2015-01-22 Thread Vladimir Protsenko
I have a problem running the Spark shell on Windows 7. I took the following steps:
1. Downloaded and installed Scala 2.11.5.
2. Downloaded Spark 1.2.0 via git clone git://github.com/apache/spark.git.
3. Ran dev/change-version-to-2.11.sh and mvn -Dscala-2.11 -DskipTests clean package (in Git Bash). …
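The steps above can be collected into one script. This is a sketch of the setup commands as given in the message (the profile flag and script path are specific to Spark 1.2; later Spark versions changed both):

```shell
# Build Spark 1.2.0 against Scala 2.11, as described above (run from Git Bash).
git clone git://github.com/apache/spark.git
cd spark
git checkout v1.2.0

# Switch the build's declared Scala version from 2.10 to 2.11.
dev/change-version-to-2.11.sh

# Build, skipping tests; -Dscala-2.11 activates the Scala 2.11 profile.
mvn -Dscala-2.11 -DskipTests clean package
```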

Re: spark-shell has syntax error on windows.

2015-01-24 Thread Vladimir Protsenko
…Windows 7 as well. I've never bothered looking into fixing it, as it seems spark-shell just calls spark-shell2 anyway.

saveAsHadoopFile is not a member of ... RDD[(String, MyObject)]

2015-02-12 Thread Vladimir Protsenko
Hi. I am stuck on how to save a file to HDFS from Spark. I have written MyOutputFormat extends FileOutputFormat; then in Spark I call either rddres.saveAsHadoopFile[MyOutputFormat]("hdfs://localhost/output") or rddres.saveAsHadoopFile("hdfs://localhost/output", classOf[String], classOf[MyObj…

Re: saveAsHadoopFile is not a member of ... RDD[(String, MyObject)]

2015-02-12 Thread Vladimir Protsenko
…(SPARK-4397).

Re: saveAsHadoopFile is not a member of ... RDD[(String, MyObject)]

2015-02-12 Thread Vladimir Protsenko
…19:42 GMT+04:00 Ted Yu: You can use JavaPairRDD, which has:

override def wrapRDD(rdd: RDD[(K, V)]): JavaPairRDD[K, V] = JavaPairRDD.fromRDD(rdd)

Cheers

Re: saveAsHadoopFile is not a member of ... RDD[(String, MyObject)]

2015-02-13 Thread Vladimir Protsenko
…grapple with ClassTag from Java, for example. There is no implicit conversion, since it is used from Java, which doesn't have implicits.

On Fri, Feb 13, 2015 at 5:57 AM, Vladimir Protsenko wrote: Thanks for the reply. I solved the problem with im…
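The truncated reply above suggests the problem was solved with an import. In Spark 1.2, pair-RDD methods such as saveAsHadoopFile live in PairRDDFunctions and are only reachable through the implicit conversions in SparkContext._ (since Spark 1.3 they are available without the import; see SPARK-4397). A minimal sketch under that assumption; MyObject and the output path here are the hypothetical names from the original message, and a stock TextOutputFormat stands in for the poster's MyOutputFormat:

```scala
import org.apache.spark.{SparkConf, SparkContext}
// In Spark 1.2 this import provides the implicit conversion
// RDD[(K, V)] => PairRDDFunctions[K, V], which adds saveAsHadoopFile.
import org.apache.spark.SparkContext._

case class MyObject(payload: String) // hypothetical value type from the thread

object SaveExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("save-example").setMaster("local[1]"))
    val rddres = sc.parallelize(Seq(("k1", MyObject("v1"))))
    // With the import above, this compiles; without it, the compiler reports
    // "saveAsHadoopFile is not a member of RDD[(String, MyObject)]".
    rddres.saveAsHadoopFile("hdfs://localhost/output",
      classOf[String], classOf[MyObject],
      classOf[org.apache.hadoop.mapred.TextOutputFormat[String, MyObject]])
    sc.stop()
  }
}
```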

Map task in Trident.

2015-03-05 Thread Vladimir Protsenko
There is a map function in Clojure, so you can map one collection onto another. The closest Trident operation is *each*; however, when f is applied to an input tuple we get a tuple with both fields: f(["field-a"]) = ["field-a" "field-b"]. How can I realize the same operation on a Trident stream?
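As the question notes, Trident's each appends the function's emitted fields to the input tuple rather than replacing it; map semantics can be recovered by following each with project to keep only the new fields. The sketch below is not the Storm API but a pure-Scala model of those two operations, with tuples modeled as field-name-to-value maps:

```scala
object TridentMapSketch {
  type Tuple = Map[String, Any]

  // each: apply f and APPEND the emitted fields to the input tuple,
  // mirroring Trident's Stream.each(inputFields, function, functionFields).
  def each(t: Tuple, f: Tuple => Tuple): Tuple = t ++ f(t)

  // project: keep only the named fields, mirroring Stream.project(fields).
  def project(t: Tuple, fields: Set[String]): Tuple =
    t.filter { case (k, _) => fields.contains(k) }

  def main(args: Array[String]): Unit = {
    val in: Tuple = Map("field-a" -> "x")
    val f = (t: Tuple) => Map("field-b" -> (t("field-a").toString + "!"))
    val appended = each(in, f) // has both field-a and field-b, as in the question
    val mapped   = project(appended, Set("field-b")) // map semantics: new field only
    println(mapped)
  }
}
```

In real Trident code the same idea is stream.each(inputFields, function, functionFields).project(outputFields).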

Spark Installation Maven PermGen OutOfMemoryException

2014-12-23 Thread Vladimir Protsenko
I am installing Spark 1.2.0 on CentOS 6.6. I just downloaded the code from GitHub, installed Maven, and am trying to compile the system:

git clone https://github.com/apache/spark.git
git checkout v1.2.0
mvn -DskipTests clean package

This leads to an OutOfMemoryException. How much memory does it require? expor…

Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-24 Thread Vladimir Protsenko
I wrongly sent this only to Sean. I have tried export MAVEN_OPTS=`-Xmx=3g -XX:MaxPermSize=1g -XX:ReservedCodeCacheSize=1g` and it doesn't work either. Best Regards, Vladimir Protsenko

2014-12-23 19:45 GMT+04:00 Guru Medasani: Thanks for the clarification, Sean. …

Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-24 Thread Vladimir Protsenko
Thanks. Bad mistake.

2014-12-24 14:02 GMT+04:00 Sean Owen: That command is still wrong. It is -Xmx3g, with no =.

On Dec 24, 2014 9:50 AM, Vladimir Protsenko wrote: Java 8 rpm 64-bit, downloaded from the official Oracle site, solved my problem. And I…
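Putting Sean's correction together with the earlier attempt: the -Xmx flag takes no equals sign. A corrected version of the setting from the thread (the sizes are the poster's, not official recommendations; on Java 8, which ultimately solved the problem, PermGen no longer exists and -XX:MaxPermSize is simply ignored with a warning):

```shell
# Correct syntax: -Xmx3g, not -Xmx=3g.
export MAVEN_OPTS="-Xmx3g -XX:MaxPermSize=1g -XX:ReservedCodeCacheSize=1g"
echo "$MAVEN_OPTS"
# prints: -Xmx3g -XX:MaxPermSize=1g -XX:ReservedCodeCacheSize=1g
```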