Re: quick start guide: building a standalone scala program

2014-09-25 Thread Andrew Ash
On Sep 25, 2014 at 1:00 AM, christy <760948...@qq.com> wrote: > I encountered exactly the same problem. How did you solve this? > > Thanks > > > > -- > View this message in context: > http://apache-spark-user-list.1001560.n3.nabble.com/quick-start-guide-building-a-standa

Re: quick start guide: building a standalone scala program

2014-09-25 Thread christy
I encountered exactly the same problem. How did you solve this? Thanks -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/quick-start-guide-building-a-standalone-scala-program-tp3116p15125.html Sent from the Apache Spark User List mailing list archive at

Re: quick start guide: building a standalone scala program

2014-09-25 Thread christy
How does this happen? How does sbt know which specific location? And though it went smoothly, I didn't see any jar being created. Please help. Thanks, Christy -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/quick-start-guide-building-a-standalon

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Nan Zhu
Yes, actually even for Spark, I mostly use the sbt I installed… so I always missed this issue. If you can reproduce the problem with a spark-distributed sbt, I suggest proposing a PR to fix the document before 0.9.1 is officially released. Best, -- Nan Zhu On Monday, March 24, 2014 at

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Diana Carroll
It is suggested implicitly in giving you the command "./sbt/sbt". The separately installed sbt isn't in a folder called sbt, whereas Spark's version is. And more relevantly, just a few paragraphs earlier in the tutorial you execute the command "sbt/sbt assembly" which definitely refers to the spar

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Nan Zhu
I found that I never read the document carefully, and I never found that the Spark documentation suggests using the Spark-distributed sbt… Best, -- Nan Zhu On Monday, March 24, 2014 at 5:47 PM, Diana Carroll wrote: > Thanks for your help, everyone. Several folks have explained that I can >

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Yana Kadiyska
Diana, I think you are correct - I just ran wget http://mirror.symnds.com/software/Apache/incubator/spark/spark-0.9.0-incubating/spark-0.9.0-incubating-bin-cdh4.tgz and indeed I see the same error that you see. It looks like in previous versions sbt-launch used to just come down in the pack
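
A quick way to check whether the launcher jar actually ships in a given binary package (the extracted directory name below is assumed to match the tarball):

    tar xzf spark-0.9.0-incubating-bin-cdh4.tgz
    ls spark-0.9.0-incubating-bin-cdh4/sbt/
    # if no sbt-launch-*.jar shows up here, sbt/sbt will try to fetch one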

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Diana Carroll
Thanks for your help, everyone. Several folks have explained that I can surely solve the problem by installing sbt. But I'm trying to get the instructions working *as written on the Spark website*. The instructions not only don't have you install sbt separately...they actually specifically have

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Nan Zhu
Hi, Diana, You don't need to use the Spark-distributed sbt; just download sbt from its official website and set your PATH to the right place. Best, -- Nan Zhu On Monday, March 24, 2014 at 4:30 PM, Diana Carroll wrote: > Yeah, that's exactly what I did. Unfortunately it doesn't work: > >
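
A minimal sketch of that approach, assuming the sbt distribution is unpacked under ~/sbt (the path is illustrative):

    export PATH=$PATH:~/sbt/bin   # point this at wherever sbt was unpacked
    sbt sbtVersion                # sanity check: sbt should now resolve from any directory

Adding the export line to ~/.bashrc makes it permanent.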

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Yana Kadiyska
Diana, I just tried it on a clean Ubuntu machine, with Spark 0.8 (since, like other folks, I had sbt preinstalled on my "usual" machine). I ran the command exactly as Ognen suggested and see "Set current project to Simple Project" (do you see this? -- you should at least be seeing this) and then a bunch

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Ognen Duzlevski
Ah crud, I guess you are right, I am using the sbt I installed manually with my Scala installation. Well, here is what you can do:
    mkdir ~/bin
    cd ~/bin
    wget http://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/sbt-launch/0.13.1/sbt-launch.jar
    vi sbt
Put the following contents into your
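
The message is cut off before the script body; the wrapper described in the sbt manual-setup instructions of that era looks roughly like this (the JVM flags are illustrative, not required):

    #!/bin/bash
    # minimal wrapper around the downloaded launcher jar
    java -Xms512M -Xmx1536M -Xss1M -XX:+CMSClassUnloadingEnabled -XX:MaxPermSize=256M -jar ~/bin/sbt-launch.jar "$@"

Then make it executable and put ~/bin on the PATH:

    chmod u+x ~/bin/sbt
    export PATH=$PATH:~/bin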

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Soumya Simanta
@Diana - you can set up sbt manually for your project by following the instructions here: http://www.scala-sbt.org/release/docs/Getting-Started/Setup.html Under "Manual Installation": Manual installation requires downloa

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Diana Carroll
Yeah, that's exactly what I did. Unfortunately it doesn't work: $SPARK_HOME/sbt/sbt package awk: cmd. line:1: fatal: cannot open file `./project/build.properties' for reading (No such file or directory) Attempting to fetch sbt /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or d
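
Reading the output above: the bundled sbt/sbt script apparently tries to read the sbt version out of ./project/build.properties in the current directory with awk, finds nothing, and therefore looks for a launcher named sbt-launch-.jar with an empty version. One hedged workaround is to give the project such a file (the version number here is an assumption):

    # project/build.properties
    sbt.version=0.13.1

Whether the script can then fetch that launcher depends on the script itself; installing a standalone sbt, as suggested elsewhere in the thread, sidesteps the problem entirely.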

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Ognen Duzlevski
You can use any sbt on your machine, including the one that comes with Spark. For example, try:
    ~/path_to_spark/sbt/sbt compile
    ~/path_to_spark/sbt/sbt run
Or you can just add that to your PATH with:
    export PATH=$PATH:~/path_to_spark/sbt
To make it permanent, you can add it to your ~/.bashrc

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Bharath Bhushan
Creating simple.sbt and src/ in $SPARK_HOME allows me to run a standalone scala program in the downloaded spark code tree. For example my directory layout is:
    $ ls spark-0.9.0-incubating-bin-hadoop2
    … simple.sbt src …
    $ tree src
    src
    `-- main
        `-- scala
            `-- SimpleApp.scala
-- Bharath On

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Diana Carroll
Thanks, Ognen. Unfortunately I'm not able to follow your instructions either. In particular: > > sbt compile > sbt run This doesn't work for me because there's no program on my path called "sbt". The instructions in the Quick Start guide are specific that I should call "$SPARK_HOME/sbt/sbt".

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Diana Carroll
Thanks, Nan Zhu. You say that my problems are "because you are in Spark directory, don't need to do that actually, the dependency on Spark is resolved by sbt." I did try it initially in what I thought was a much more typical place, e.g. ~/mywork/sparktest1. But as I said in my email: (Just for

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Ognen Duzlevski
Diana, Anywhere on the filesystem you have read/write access (you need not be in your spark home directory):
    mkdir myproject
    cd myproject
    mkdir project
    mkdir target
    mkdir -p src/main/scala
    cp $mypath/$mymysource.scala src/main/scala/
    cp $mypath/myproject.sbt .
Make sure that myproject.sbt has
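
The message is cut off before the build file contents; for Spark 0.9 a quick-start-style myproject.sbt looks roughly like this (the Scala version and the resolver line are assumptions based on that release):

    name := "Simple Project"

    version := "1.0"

    scalaVersion := "2.10.3"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating"

    resolvers += "Akka Repository" at "http://repo.akka.io/releases/"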

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Diana Carroll
Yana: Thanks. Can you give me a transcript of the actual commands you are running? Thanks! Diana On Mon, Mar 24, 2014 at 3:59 PM, Yana Kadiyska wrote: > I am able to run standalone apps. I think you are making one mistake > that throws you off from there onwards. You don't need to put your app

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Yana Kadiyska
I am able to run standalone apps. I think you are making one mistake that throws you off from there onwards. You don't need to put your app under SPARK_HOME. I would create it in its own folder somewhere; it follows the rules of any standalone scala program (including the layout). In the guide, $SP
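
Pulling the thread's suggestions together, a minimal sketch of setting such a project up outside SPARK_HOME (paths are illustrative, and it assumes a separately installed sbt on the PATH, as suggested elsewhere in the thread):

    mkdir -p ~/mywork/sparktest1/src/main/scala
    cd ~/mywork/sparktest1
    # SimpleApp.scala goes under src/main/scala/, simple.sbt at the top level
    sbt package   # the jar lands under target/ (exact path depends on the Scala version)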

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Nan Zhu
Hi, Diana, See my inlined answer -- Nan Zhu On Monday, March 24, 2014 at 3:44 PM, Diana Carroll wrote: > Has anyone successfully followed the instructions on the Quick Start page of > the Spark home page to run a "standalone" Scala application? I can't, and I > figure I must be miss

quick start guide: building a standalone scala program

2014-03-24 Thread Diana Carroll
Has anyone successfully followed the instructions on the Quick Start page of the Spark home page to run a "standalone" Scala application? I can't, and I figure I must be missing something obvious! I'm trying to follow the instructions here as close to "word for word" as possible: http://spark.apa
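
For reference, the kind of standalone application the quick start walks through is essentially a line count over a local text file; a sketch along those lines, assuming the Spark 0.9-era API, a local master, and an illustrative file path:

    /* SimpleApp.scala */
    import org.apache.spark.SparkContext
    import org.apache.spark.SparkContext._

    object SimpleApp {
      def main(args: Array[String]) {
        // point this at any text file on the local machine
        val logFile = "/usr/lib/spark/README.md"
        val sc = new SparkContext("local", "Simple App")
        val logData = sc.textFile(logFile, 2).cache()
        val numAs = logData.filter(line => line.contains("a")).count()
        val numBs = logData.filter(line => line.contains("b")).count()
        println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
      }
    }

Built with "sbt package", this is the jar that the thread's various sbt invocations are trying to produce.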