I didn't see that problem.
Did you run this command?

mvn -Phbase-hadoop2,hadoop-2.4,yarn -Dhadoop.version=2.4.1 -DskipTests clean package

Here is what I got:

TYus-MacBook-Pro:spark-1.0.2 tyu$ sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to
/Users/tyu/spark-1.0.2/sbin/../logs/spark-tyu-org.apache.spark.deploy.master.Master-1-TYus-MacBook-Pro.local.out
localhost: ssh: connect to host localhost port 22: Connection refused
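
The "Connection refused" from ssh above only affects launching the worker
(start-all.sh uses ssh even for localhost); the master itself still started,
as jps shows below. If a local worker is wanted without ssh, something along
these lines should work (master URL taken from the ps output further down):

bin/spark-class org.apache.spark.deploy.worker.Worker spark://TYus-MacBook-Pro.local:7077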

TYus-MacBook-Pro:spark-1.0.2 tyu$ vi
logs/spark-tyu-org.apache.spark.deploy.master.Master-1-TYus-MacBook-Pro.local.out
TYus-MacBook-Pro:spark-1.0.2 tyu$ jps
11563 Master
11635 Jps

TYus-MacBook-Pro:spark-1.0.2 tyu$ ps aux | grep 11563
tyu             11563   0.7  0.8  3333196 142444 s003  S     6:52AM
0:02.72
/Library/Java/JavaVirtualMachines/jdk1.7.0_60.jdk/Contents/Home/bin/java
-cp
::/Users/tyu/spark-1.0.2/conf:/Users/tyu/spark-1.0.2/assembly/target/scala-2.10/spark-assembly-1.0.2-hadoop2.4.1.jar
-XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m
org.apache.spark.deploy.master.Master --ip TYus-MacBook-Pro.local --port
7077 --webui-port 8080

TYus-MacBook-Pro:spark-1.0.2 tyu$ ls -l
assembly/target/scala-2.10/spark-assembly-1.0.2-hadoop2.4.1.jar
-rw-r--r--  1 tyu  staff  121182305 Aug 27 21:13
assembly/target/scala-2.10/spark-assembly-1.0.2-hadoop2.4.1.jar

Cheers


On Thu, Aug 28, 2014 at 3:42 AM, arthur.hk.c...@gmail.com <
arthur.hk.c...@gmail.com> wrote:

> Hi,
>
> I tried to start Spark but failed:
>
> $ ./sbin/start-all.sh
> starting org.apache.spark.deploy.master.Master, logging to
> /mnt/hadoop/spark-1.0.2/sbin/../logs/spark-edhuser-org.apache.spark.deploy.master.Master-1-m133.out
> failed to launch org.apache.spark.deploy.master.Master:
>   Failed to find Spark assembly in
> /mnt/hadoop/spark-1.0.2/assembly/target/scala-2.10/
>
> $ ll assembly/
> total 20
> -rw-rw-r--. 1 hduser hadoop 11795 Jul 26 05:50 pom.xml
> -rw-rw-r--. 1 hduser hadoop   507 Jul 26 05:50 README
> drwxrwxr-x. 4 hduser hadoop  4096 Jul 26 05:50 *src*
>
>
>
> Regards
> Arthur
>
>
>
> On 28 Aug, 2014, at 6:19 pm, Ted Yu <yuzhih...@gmail.com> wrote:
>
> I see 0.98.5 in dep.txt
>
> You should be good to go.
>
>
> On Thu, Aug 28, 2014 at 3:16 AM, arthur.hk.c...@gmail.com <
> arthur.hk.c...@gmail.com> wrote:
>
>> Hi,
>>
>> I tried:
>> mvn -Phbase-hadoop2,hadoop-2.4,yarn -Dhadoop.version=2.4.1 -DskipTests dependency:tree > dep.txt
>>
>> Attached dep.txt for your information.
>>
>>
>> Regards
>> Arthur
>>
>> On 28 Aug, 2014, at 12:22 pm, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>> I forgot to include '-Dhadoop.version=2.4.1' in the command below.
>>
>> The modified command passed.
>>
>> You can verify the dependency on HBase 0.98 with this command:
>>
>> mvn -Phbase-hadoop2,hadoop-2.4,yarn -Dhadoop.version=2.4.1 -DskipTests dependency:tree > dep.txt
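>>
>> A quick way to check the result, assuming dep.txt was written to the current
>> directory:
>>
>> grep -i hbase dep.txt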
>>
>> Cheers
>>
>>
>> On Wed, Aug 27, 2014 at 8:58 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>>> Looks like the patch given by that URL only had the last commit.
>>>
>>> I have attached pom.xml for spark-1.0.2 to SPARK-1297.
>>> You can download it and replace examples/pom.xml with the downloaded pom.
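>>>
>>> Something like this should do it (the download location below is just an
>>> example; use wherever you saved the attachment):
>>>
>>> cp examples/pom.xml examples/pom.xml.orig
>>> cp ~/Downloads/pom.xml examples/pom.xml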
>>>
>>> I am running this command locally:
>>>
>>> mvn -Phbase-hadoop2,hadoop-2.4,yarn -DskipTests clean package
>>>
>>> Cheers
>>>
>>>
>>> On Wed, Aug 27, 2014 at 7:57 PM, arthur.hk.c...@gmail.com <
>>> arthur.hk.c...@gmail.com> wrote:
>>>
>>>> Hi Ted,
>>>>
>>>> Thanks.
>>>>
>>>> I tried [patch -p1 -i 1893.patch] and got "Hunk #1 FAILED at 45."
>>>> Is this normal?
>>>>
>>>> Regards
>>>> Arthur
>>>>
>>>>
>>>> patch -p1 -i 1893.patch
>>>> patching file examples/pom.xml
>>>> Hunk #1 FAILED at 45.
>>>> Hunk #2 succeeded at 94 (offset -16 lines).
>>>> 1 out of 2 hunks FAILED -- saving rejects to file examples/pom.xml.rej
>>>> patching file examples/pom.xml
>>>> Hunk #1 FAILED at 54.
>>>> Hunk #2 FAILED at 72.
>>>>  Hunk #3 succeeded at 122 (offset -49 lines).
>>>> 2 out of 3 hunks FAILED -- saving rejects to file examples/pom.xml.rej
>>>> patching file docs/building-with-maven.md
>>>> patching file examples/pom.xml
>>>> Hunk #1 succeeded at 122 (offset -40 lines).
>>>> Hunk #2 succeeded at 195 (offset -40 lines).
>>>>
>>>>
>>>> On 28 Aug, 2014, at 10:53 am, Ted Yu <yuzhih...@gmail.com> wrote:
>>>>
>>>> Can you use this command ?
>>>>
>>>> patch -p1 -i 1893.patch
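>>>>
>>>> The -p1 matters because the patch comes from git and prefixes every path
>>>> with a/ and b/; -p1 strips that first component. To test it first without
>>>> touching any files, --dry-run should work with GNU patch:
>>>>
>>>> patch -p1 --dry-run -i 1893.patch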
>>>>
>>>> Cheers
>>>>
>>>>
>>>> On Wed, Aug 27, 2014 at 7:41 PM, arthur.hk.c...@gmail.com <
>>>> arthur.hk.c...@gmail.com> wrote:
>>>>
>>>>> Hi Ted,
>>>>>
>>>>> I tried the following steps to apply patch 1893, but several hunks FAILED.
>>>>> Can you please advise how to get past this error, or is my
>>>>> spark-1.0.2 source not the correct one?
>>>>>
>>>>> Regards
>>>>> Arthur
>>>>>
>>>>> wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
>>>>> tar -vxf spark-1.0.2.tgz
>>>>> cd spark-1.0.2
>>>>> wget https://github.com/apache/spark/pull/1893.patch
>>>>> patch  < 1893.patch
>>>>> patching file pom.xml
>>>>> Hunk #1 FAILED at 45.
>>>>> Hunk #2 FAILED at 110.
>>>>> 2 out of 2 hunks FAILED -- saving rejects to file pom.xml.rej
>>>>> patching file pom.xml
>>>>> Hunk #1 FAILED at 54.
>>>>> Hunk #2 FAILED at 72.
>>>>> Hunk #3 FAILED at 171.
>>>>> 3 out of 3 hunks FAILED -- saving rejects to file pom.xml.rej
>>>>> can't find file to patch at input line 267
>>>>> Perhaps you should have used the -p or --strip option?
>>>>> The text leading up to this was:
>>>>> --------------------------
>>>>> |
>>>>> |From cd58437897bf02b644c2171404ccffae5d12a2be Mon Sep 17 00:00:00 2001
>>>>> |From: tedyu <yuzhih...@gmail.com>
>>>>> |Date: Mon, 11 Aug 2014 15:57:46 -0700
>>>>> |Subject: [PATCH 3/4] SPARK-1297 Upgrade HBase dependency to 0.98 - add
>>>>> | description to building-with-maven.md
>>>>> |
>>>>> |---
>>>>> | docs/building-with-maven.md | 3 +++
>>>>> | 1 file changed, 3 insertions(+)
>>>>> |
>>>>> |diff --git a/docs/building-with-maven.md b/docs/
>>>>> building-with-maven.md
>>>>> |index 672d0ef..f8bcd2b 100644
>>>>> |--- a/docs/building-with-maven.md
>>>>> |+++ b/docs/building-with-maven.md
>>>>> --------------------------
>>>>> File to patch:
>>>>>
>>>>>
>>>>>
>>>>> On 28 Aug, 2014, at 10:24 am, Ted Yu <yuzhih...@gmail.com> wrote:
>>>>>
>>>>> You can get the patch from this URL:
>>>>> https://github.com/apache/spark/pull/1893.patch
>>>>>
>>>>> BTW, HBase 0.98.5 has been released; you can specify 0.98.5-hadoop2 in the
>>>>> pom.xml.
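>>>>>
>>>>> For example, assuming the property is named hbase.version as in the pom
>>>>> edits quoted below:
>>>>>
>>>>>     <hbase.version>0.98.5-hadoop2</hbase.version>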
>>>>>
>>>>> Cheers
>>>>>
>>>>>
>>>>> On Wed, Aug 27, 2014 at 7:18 PM, arthur.hk.c...@gmail.com <
>>>>> arthur.hk.c...@gmail.com> wrote:
>>>>>
>>>>>> Hi Ted,
>>>>>>
>>>>>> Thank you so much!!
>>>>>>
>>>>>> As I am new to Spark, can you please advise the steps to
>>>>>> apply this patch to my spark-1.0.2 source folder?
>>>>>>
>>>>>> Regards
>>>>>> Arthur
>>>>>>
>>>>>>
>>>>>> On 28 Aug, 2014, at 10:13 am, Ted Yu <yuzhih...@gmail.com> wrote:
>>>>>>
>>>>>> See SPARK-1297.
>>>>>>
>>>>>> The pull request is here:
>>>>>> https://github.com/apache/spark/pull/1893
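>>>>>>
>>>>>> If I read SPARK-1297 correctly, the underlying issue is that HBase 0.96+
>>>>>> is published as per-module artifacts (hbase-client, hbase-common,
>>>>>> hbase-server, ...) instead of one monolithic jar, so a coordinate like
>>>>>> the sketch below no longer resolves, and the pull request moves the
>>>>>> examples build onto the modular artifacts:
>>>>>>
>>>>>> "org.apache.hbase" % "hbase" % "0.98.2"  // no monolithic hbase jar is published for 0.98.x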
>>>>>>
>>>>>>
>>>>>> On Wed, Aug 27, 2014 at 6:57 PM, arthur.hk.c...@gmail.com <
>>>>>> arthur.hk.c...@gmail.com> wrote:
>>>>>>
>>>>>>> (Correction: "Compilation Error: Spark 1.0.2 with HBase 0.98";
>>>>>>> please ignore if duplicated)
>>>>>>>
>>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> I need to use Spark with HBase 0.98, so I tried to compile Spark 1.0.2
>>>>>>> against HBase 0.98.
>>>>>>>
>>>>>>> My steps:
>>>>>>> wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
>>>>>>> tar -vxf spark-1.0.2.tgz
>>>>>>> cd spark-1.0.2
>>>>>>>
>>>>>>> Edit project/SparkBuild.scala and set HBASE_VERSION:
>>>>>>>   // HBase version; set as appropriate.
>>>>>>>   val HBASE_VERSION = "0.98.2"
>>>>>>>
>>>>>>>
>>>>>>> Edit pom.xml with the following values:
>>>>>>>     <hadoop.version>2.4.1</hadoop.version>
>>>>>>>     <protobuf.version>2.5.0</protobuf.version>
>>>>>>>     <yarn.version>${hadoop.version}</yarn.version>
>>>>>>>     <hbase.version>0.98.5</hbase.version>
>>>>>>>     <zookeeper.version>3.4.6</zookeeper.version>
>>>>>>>     <hive.version>0.13.1</hive.version>
>>>>>>>
>>>>>>>
>>>>>>> SPARK_HADOOP_VERSION=2.4.1 SPARK_YARN=true sbt/sbt clean assembly
>>>>>>> but it fails because of UNRESOLVED DEPENDENCIES "hbase;0.98.2"
>>>>>>>
>>>>>>> Can you please advise how to compile Spark 1.0.2 with HBase 0.98, or
>>>>>>> should I set HBASE_VERSION back to "0.94.6"?
>>>>>>>
>>>>>>> Regards
>>>>>>> Arthur
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> [warn]  ::::::::::::::::::::::::::::::::::::::::::::::
>>>>>>> [warn]  ::          UNRESOLVED DEPENDENCIES         ::
>>>>>>> [warn]  ::::::::::::::::::::::::::::::::::::::::::::::
>>>>>>> [warn]  :: org.apache.hbase#hbase;0.98.2: not found
>>>>>>> [warn]  ::::::::::::::::::::::::::::::::::::::::::::::
>>>>>>>
>>>>>>> sbt.ResolveException: unresolved dependency:
>>>>>>> org.apache.hbase#hbase;0.98.2: not found
>>>>>>>         at
>>>>>>> sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:217)
>>>>>>>         at
>>>>>>> sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:126)
>>>>>>>         at
>>>>>>> sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:125)
>>>>>>>         at
>>>>>>> sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:116)
>>>>>>>         at
>>>>>>> sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:116)
>>>>>>>         at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:104)
>>>>>>>         at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:51)
>>>>>>>         at sbt.IvySbt$$anon$3.call(Ivy.scala:60)
>>>>>>>         at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:98)
>>>>>>>         at
>>>>>>> xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:81)
>>>>>>>         at
>>>>>>> xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:102)
>>>>>>>         at xsbt.boot.Using$.withResource(Using.scala:11)
>>>>>>>         at xsbt.boot.Using$.apply(Using.scala:10)
>>>>>>>         at
>>>>>>> xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:62)
>>>>>>>         at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:52)
>>>>>>>         at xsbt.boot.Locks$.apply0(Locks.scala:31)
>>>>>>>         at xsbt.boot.Locks$.apply(Locks.scala:28)
>>>>>>>         at sbt.IvySbt.withDefaultLogger(Ivy.scala:60)
>>>>>>>         at sbt.IvySbt.withIvy(Ivy.scala:101)
>>>>>>>         at sbt.IvySbt.withIvy(Ivy.scala:97)
>>>>>>>         at sbt.IvySbt$Module.withModule(Ivy.scala:116)
>>>>>>>         at sbt.IvyActions$.update(IvyActions.scala:125)
>>>>>>>         at
>>>>>>> sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1170)
>>>>>>>         at
>>>>>>> sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1168)
>>>>>>>         at
>>>>>>> sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$73.apply(Defaults.scala:1191)
>>>>>>>         at
>>>>>>> sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$73.apply(Defaults.scala:1189)
>>>>>>>         at sbt.Tracked$$anonfun$lastOutput$1.apply(Tracked.scala:35)
>>>>>>>         at
>>>>>>> sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1193)
>>>>>>>         at
>>>>>>> sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1188)
>>>>>>>         at
>>>>>>> sbt.Tracked$$anonfun$inputChanged$1.apply(Tracked.scala:45)
>>>>>>>         at sbt.Classpaths$.cachedUpdate(Defaults.scala:1196)
>>>>>>>         at
>>>>>>> sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1161)
>>>>>>>         at
>>>>>>> sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1139)
>>>>>>>         at
>>>>>>> scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
>>>>>>>         at
>>>>>>> sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:42)
>>>>>>>         at sbt.std.Transform$$anon$4.work(System.scala:64)
>>>>>>>         at
>>>>>>> sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
>>>>>>>         at
>>>>>>> sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
>>>>>>>         at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
>>>>>>>         at sbt.Execute.work(Execute.scala:244)
>>>>>>>         at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
>>>>>>>         at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
>>>>>>>         at
>>>>>>> sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:160)
>>>>>>>         at
>>>>>>> sbt.CompletionService$$anon$2.call(CompletionService.scala:30)
>>>>>>>         at
>>>>>>> java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>>>>>         at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>>>>>         at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
>>>>>>>         at
>>>>>>> java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>>>>>         at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>>>>>         at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>>>>>>>         at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>>>>>>>         at java.lang.Thread.run(Thread.java:662)
>>>>>>> [error] (examples/*:update) sbt.ResolveException: unresolved
>>>>>>> dependency: org.apache.hbase#hbase;0.98.2: not found
>>>>>>> [error] Total time: 270 s, completed Aug 28, 2014 9:42:05 AM
>>>>>>> ---------------------------------------------------------------------
>>>>>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>>>>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>>
>>
>
>
