Fwd: Container preempted by scheduler - Spark job error

2016-06-02 Thread Prabeesh K.
cityScheduler being used ? > > Thanks > > On Thu, Jun 2, 2016 at 1:32 AM, Prabeesh K. wrote: > >> Hi I am using the below command to run a spark job and I get an error >> like "Container preempted by scheduler" >> >> I am not sure if it's rela

Container preempted by scheduler - Spark job error

2016-06-02 Thread Prabeesh K.
Hi, I am using the below command to run a Spark job and I get an error like "Container preempted by scheduler". I am not sure if it's related to wrong usage of memory: nohup ~/spark1.3/bin/spark-submit \ --num-executors 50 \ --master yarn \ --deploy-mode cluster \ --queue adhoc \ --driver-memor
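The quoted command is cut off by the archive; a hedged sketch of a complete spark-submit invocation of this shape (queue name, memory sizes, and jar path are all placeholder values, not the original poster's) might look like:

```shell
# Hypothetical reconstruction of a spark-submit command of the shape quoted
# above; all values below are placeholders, not the original settings.
nohup ~/spark1.3/bin/spark-submit \
  --num-executors 50 \
  --master yarn \
  --deploy-mode cluster \
  --queue adhoc \
  --driver-memory 4g \
  --executor-memory 8g \
  --executor-cores 4 \
  path/to/app.jar &
```

Note that "Container preempted by scheduler" generally means YARN's capacity or fair scheduler reclaimed the container for another queue, so it can occur even when memory settings are sensible; checking the queue's preemption policy may matter as much as the memory flags.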

Re: Spark + Jupyter (IPython Notebook)

2015-08-18 Thread Prabeesh K.
Refer this post http://blog.prabeeshk.com/blog/2015/06/19/pyspark-notebook-with-docker/ Spark + Jupyter + Docker On 18 August 2015 at 21:29, Jerry Lam wrote: > Hi Guru, > > Thanks! Great to hear that someone tried it in production. How do you like > it so far? > > Best Regards, > > Jerry > > >
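The linked post pairs Spark with Jupyter via Docker; as a rough sketch of that setup (the image name below is a placeholder for whichever PySpark-notebook image you build or pull, not the one from the post):

```shell
# Run a Jupyter notebook wired to PySpark inside a Docker container.
# "my/pyspark-notebook" is a hypothetical image name; -v mounts a local
# notebooks directory so work survives container restarts.
docker run -d -p 8888:8888 \
  -v "$PWD/notebooks:/notebooks" \
  my/pyspark-notebook
# Then open http://localhost:8888 in a browser.
```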

Re: Packaging Java + Python library

2015-04-13 Thread prabeesh k
Refer this post http://blog.prabeeshk.com/blog/2015/04/07/self-contained-pyspark-application/ On 13 April 2015 at 17:41, Punya Biswal wrote: > Dear Spark users, > > My team is working on a small library that builds on PySpark and is > organized like PySpark as well -- it has a JVM component (tha
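One hedged way to ship a library with both a JVM and a Python side alongside a PySpark application (all file and package names below are placeholders for illustration):

```shell
# Bundle the JVM component into a jar and the Python package into a zip,
# then hand both to spark-submit. Names are hypothetical examples.
sbt package                  # builds the JVM side, e.g. target/.../mylib.jar
zip -r mylib.zip mylib/      # zips the Python package directory
spark-submit \
  --jars mylib.jar \
  --py-files mylib.zip \
  my_app.py
```

The `--jars` flag puts the JVM classes on the executors' classpath, while `--py-files` distributes the Python package to each worker's PYTHONPATH.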

Re: How to learn Spark ?

2015-04-02 Thread prabeesh k
You can also refer to this blog http://blog.prabeeshk.com/blog/archives/ On 2 April 2015 at 12:19, Star Guo wrote: > Hi, all > > > > I am new here. Could you give me some suggestions on how to learn Spark? > Thanks. > > > > Best Regards, > > Star Guo >

Re: Beginner in Spark

2015-02-10 Thread prabeesh k
Refer to this blog for a step-by-step installation of Spark on Ubuntu. On 7 February 2015 at 03:12, Matei Zaharia wrote: > You don't need HDFS or virtual machines to run Spark. You can just > download it, unzip it an

Re: Spark installation

2015-02-10 Thread prabeesh k
Refer to this blog for a step-by-step installation. On 11 February 2015 at 03:42, Mohit Singh wrote: > For a local machine, I don't think there is anything to install. Just unzip and > go to $SPARK_DIR/bin/spark-shell and t

Re: Need a partner

2015-02-10 Thread prabeesh k
You can also refer to this course on edX: Introduction to Big Data with Apache Spark. On 11 February 201

Re: Kestrel and Spark Stream

2014-11-18 Thread prabeesh k
You can refer to the following link: https://github.com/prabeesh/Spark-Kestrel On Tue, Nov 18, 2014 at 3:51 PM, Akhil Das wrote: > You can implement a custom receiver > <http://spark.apache.org/docs/latest/streaming-custom-receivers.html> to > connect to Kestrel and use it. I thi

Re: Unable to run a Standalone job

2014-06-05 Thread prabeesh k
Try the sbt clean command before building the app, or delete the .ivy2 and .sbt folders (not a good method). Then try to rebuild the project. On Thu, Jun 5, 2014 at 11:45 AM, Sean Owen wrote: > I think this is SPARK-1949 again: https://github.com/apache/spark/pull/906 > I think this change fixed this is
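The advice above amounts to the following command sequence; the cache deletions are the heavy-handed fallback and will force every dependency to be re-downloaded on the next build:

```shell
# Clean rebuild of an sbt project.
sbt clean                # removes compiled classes and build artifacts
# Fallback only if the clean rebuild still fails:
rm -rf ~/.ivy2           # drops all cached dependency jars
rm -rf ~/.sbt            # drops sbt's bootstrap and plugin state
sbt package              # rebuild from scratch
```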

Re: Can't seem to link "external/twitter" classes from my own app

2014-06-05 Thread prabeesh k
nds Build { lazy val root = Project("root", file(".")) dependsOn( uri("git://github.com/sbt/sbt-assembly.git#0.11.4") ) } Then try *sbt assembly.* Let me know whether it works or not. Regards, prabeesh On Thu, Jun 5, 2014 at 1:16 PM, Nick Pentreath wrote: >

Re: Re: mismatched hdfs protocol

2014-06-05 Thread prabeesh k
you! > > i am developping a java project in Eclipse IDE on Windows > in which spark 1.0.0 libraries are imported > and now i want to open HDFS files as input > the hadoop version of HDFS is 2.4.0 > > 2014-06-05 > -- > bluejoe2008 > > *From:

Re: mismatched hdfs protocol

2014-06-04 Thread prabeesh k
To build Spark against a particular version of Hadoop, refer to http://spark.apache.org/docs/latest/hadoop-third-party-distributions.html On Thu, Jun 5, 2014 at 8:14 AM, Koert Kuipers wrote: > you have to build spark against the version of hadoop you are using > > > On Wed, Jun 4, 2014 at 10:25 PM,
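For the Spark 1.0.x source tree being discussed, setting the Hadoop version at build time looked roughly like the following (the version number is an example matching the 2.4.0 HDFS mentioned in the thread; check the linked page for the profiles your Spark release actually supports):

```shell
# Build Spark against Hadoop 2.4.0 with YARN support (Spark 1.0.x era).
# sbt route: the Hadoop version is passed through environment variables.
SPARK_HADOOP_VERSION=2.4.0 SPARK_YARN=true sbt/sbt assembly

# Maven route: the version is a property plus a matching build profile.
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
```

A client built against a mismatched Hadoop version is exactly what produces "mismatched protocol" errors when talking to HDFS, so the build-time version must match the cluster's.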

Unable to execute saveAsTextFile on multi node mesos

2014-05-31 Thread prabeesh k
/05/30 10:16:35 INFO DAGScheduler: Executor lost: 201405291518-3644595722-5050-17933-1 (epoch 148)* Let me know your thoughts regarding this. Regards, prabeesh

Re: Announcing Spark 1.0.0

2014-05-30 Thread prabeesh k
I forgot to hard-refresh. Thanks. On Fri, May 30, 2014 at 4:18 PM, Patrick Wendell wrote: > It is updated - try holding "Shift + refresh" in your browser, you are > probably caching the page. > > On Fri, May 30, 2014 at 3:46 AM, prabeesh k wrote: > > Please update

Re: Announcing Spark 1.0.0

2014-05-30 Thread prabeesh k
Please update the http://spark.apache.org/docs/latest/ link On Fri, May 30, 2014 at 4:03 PM, Margusja wrote: > Is it possible to download pre build package? > http://mirror.symnds.com/software/Apache/incubator/ > spark/spark-1.0.0/spark-1.0.0-bin-hadoop2.tgz - gives me 404 > > Best regards, Ma

Re: spark job stuck when running on mesos fine grained mode

2014-05-29 Thread prabeesh
Hi Lukasz Jastrzebski, I have a question regarding Shark execution on Mesos. I am querying a file which is in HDFS and writing the results back to HDFS. The problem I am facing is that I am unable to write the output to HDFS; i.e., when I use the method saveAsTextFile(), the job is getting resubmitted

java.lang.OutOfMemoryError while running Shark on Mesos

2014-05-22 Thread prabeesh k
ask(ForkJoinPool.java:1339) at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) Please help me to resolve this. Thanks in adv regards, prabeesh

Re: Better option to use Querying in Spark

2014-05-05 Thread prabeesh k
Thank you for your prompt reply. Regards, prabeesh On Tue, May 6, 2014 at 11:44 AM, Mayur Rustagi wrote: > All three have different usecases. If you are looking for more of a > warehouse you are better off with Shark. > SparkSQL is a way to query regular data in sql like syntax l

Better option to use Querying in Spark

2014-05-05 Thread prabeesh k
Regards. prabeesh

Re: Compile SimpleApp.scala encountered error, please can any one help?

2014-04-11 Thread prabeesh k
Ensure there is only one SimpleApp object in your project, and check whether there is any copy of SimpleApp.scala. Normally the file SimpleApp.scala is in src/main/scala or in the project root folder. On Sat, Apr 12, 2014 at 11:07 AM, jni2000 wrote: > Hi > > I am a new Spark user and try to test run it from
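A quick way to check for stray copies of the file, sketched here with a throwaway project layout so the command is self-contained:

```shell
# Create a minimal throwaway layout with one SimpleApp.scala, then list
# every copy under the project root; more than one line of output means
# duplicate definitions that will collide at compile time.
mkdir -p demo/src/main/scala
printf 'object SimpleApp\n' > demo/src/main/scala/SimpleApp.scala
find demo -name 'SimpleApp.scala'
# prints: demo/src/main/scala/SimpleApp.scala
```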

Re: Spark packaging

2014-04-09 Thread prabeesh k
Please refer to http://prabstechblog.blogspot.in/2014/04/creating-single-jar-for-spark-project.html Regards, prabeesh On Wed, Apr 9, 2014 at 1:04 PM, Pradeep baji wrote: > Hi all, > > I am new to spark and trying to learn it. Is there any document which > describes how spark is pack

[BLOG] For Beginners

2014-04-07 Thread prabeesh k
://prabstechblog.blogspot.in/2014/04/creating-single-jar-for-spark-project.html prabeesh