Hi,

I applied the patch.

1) Patch applied

$ patch -p0 -i spark-1297-v5.txt
patching file docs/building-with-maven.md
patching file examples/pom.xml


2) Compilation result
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Spark Project Parent POM .......................... SUCCESS [1.550s]
[INFO] Spark Project Core ................................ SUCCESS [1:32.175s]
[INFO] Spark Project Bagel ............................... SUCCESS [10.809s]
[INFO] Spark Project GraphX .............................. SUCCESS [31.435s]
[INFO] Spark Project Streaming ........................... SUCCESS [44.518s]
[INFO] Spark Project ML Library .......................... SUCCESS [48.992s]
[INFO] Spark Project Tools ............................... SUCCESS [7.028s]
[INFO] Spark Project Catalyst ............................ SUCCESS [40.365s]
[INFO] Spark Project SQL ................................. SUCCESS [43.305s]
[INFO] Spark Project Hive ................................ SUCCESS [36.464s]
[INFO] Spark Project REPL ................................ SUCCESS [20.319s]
[INFO] Spark Project YARN Parent POM ..................... SUCCESS [1.032s]
[INFO] Spark Project YARN Stable API ..................... SUCCESS [19.379s]
[INFO] Spark Project Hive Thrift Server .................. SUCCESS [12.470s]
[INFO] Spark Project Assembly ............................ SUCCESS [13.822s]
[INFO] Spark Project External Twitter .................... SUCCESS [9.566s]
[INFO] Spark Project External Kafka ...................... SUCCESS [12.848s]
[INFO] Spark Project External Flume Sink ................. SUCCESS [10.437s]
[INFO] Spark Project External Flume ...................... SUCCESS [14.554s]
[INFO] Spark Project External ZeroMQ ..................... SUCCESS [9.994s]
[INFO] Spark Project External MQTT ....................... SUCCESS [8.684s]
[INFO] Spark Project Examples ............................ SUCCESS [1:31.610s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 9:41.700s
[INFO] Finished at: Sun Sep 14 23:51:56 HKT 2014
[INFO] Final Memory: 83M/1071M
[INFO] ------------------------------------------------------------------------



3) Testing in the Scala REPL:
scala> package org.apache.spark.examples
<console>:1: error: illegal start of definition
       package org.apache.spark.examples
       ^


scala> import org.apache.hadoop.hbase.client.HBaseAdmin
import org.apache.hadoop.hbase.client.HBaseAdmin

scala> import org.apache.hadoop.hbase.{HBaseConfiguration, HTableDescriptor}
import org.apache.hadoop.hbase.{HBaseConfiguration, HTableDescriptor}

scala> import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.hbase.mapreduce.TableInputFormat

scala> import org.apache.spark._
import org.apache.spark._

scala> object HBaseTest {
     | def main(args: Array[String]) {
     | val sparkConf = new SparkConf().setAppName("HBaseTest")
     | val sc = new SparkContext(sparkConf)
     | val conf = HBaseConfiguration.create()
     | // Other options for configuring scan behavior are available. More information available at
     | // http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/mapreduce/TableInputFormat.html
     | conf.set(TableInputFormat.INPUT_TABLE, args(0))
     | // Initialize hBase table if necessary
     | val admin = new HBaseAdmin(conf)
     | if (!admin.isTableAvailable(args(0))) {
     | val tableDesc = new HTableDescriptor(args(0))
     | admin.createTable(tableDesc)
     | }
     | val hBaseRDD = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat],
     | classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable],
     | classOf[org.apache.hadoop.hbase.client.Result])
     | hBaseRDD.count()
     | sc.stop()
     | }
     | }
warning: there were 1 deprecation warning(s); re-run with -deprecation for details
defined module HBaseTest
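
As a side note, a minimal sketch (assuming this is spark-shell, which already provides "sc", and that the HBase client jars are on its classpath; "test_table" is just a placeholder table name): instead of calling HBaseTest.main, which would create a second SparkContext inside the shell, the existing sc can be reused directly:

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.mapreduce.TableInputFormat

// Reuse the SparkContext that spark-shell already provides as sc.
val hbaseConf = HBaseConfiguration.create()
hbaseConf.set(TableInputFormat.INPUT_TABLE, "test_table")  // placeholder table name

val hBaseRDD = sc.newAPIHadoopRDD(hbaseConf, classOf[TableInputFormat],
  classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable],
  classOf[org.apache.hadoop.hbase.client.Result])
hBaseRDD.count()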



Now the only error left is when I try to run "package org.apache.spark.examples" in the REPL.

Please advise.
Regards
Arthur



On 14 Sep, 2014, at 11:41 pm, Ted Yu <yuzhih...@gmail.com> wrote:

> I applied the patch on master branch without rejects.
> 
> If you use spark 1.0.2, use pom.xml attached to the JIRA.
> 
> On Sun, Sep 14, 2014 at 8:38 AM, arthur.hk.c...@gmail.com 
> <arthur.hk.c...@gmail.com> wrote:
> Hi,
> 
> Thanks!
> 
> patch -p0 -i spark-1297-v5.txt
> patching file docs/building-with-maven.md
> patching file examples/pom.xml
> Hunk #1 FAILED at 45.
> Hunk #2 FAILED at 110.
> 2 out of 2 hunks FAILED -- saving rejects to file examples/pom.xml.rej
> 
> Still got errors.
> 
> Regards
> Arthur
> 
> On 14 Sep, 2014, at 11:33 pm, Ted Yu <yuzhih...@gmail.com> wrote:
> 
>> spark-1297-v5.txt is level 0 patch
>> 
>> Please use spark-1297-v5.txt
>> 
>> Cheers
>> 
>> On Sun, Sep 14, 2014 at 8:06 AM, arthur.hk.c...@gmail.com 
>> <arthur.hk.c...@gmail.com> wrote:
>> Hi,
>> 
>> Thanks!!
>> 
>> I tried to apply the patches, both spark-1297-v2.txt and spark-1297-v4.txt 
>> are good here,  but not spark-1297-v5.txt:
>> 
>> 
>> $ patch -p1 -i spark-1297-v4.txt
>> patching file examples/pom.xml
>> 
>> $ patch -p1 -i spark-1297-v5.txt
>> can't find file to patch at input line 5
>> Perhaps you used the wrong -p or --strip option?
>> The text leading up to this was:
>> --------------------------
>> |diff --git docs/building-with-maven.md docs/building-with-maven.md
>> |index 672d0ef..f8bcd2b 100644
>> |--- docs/building-with-maven.md
>> |+++ docs/building-with-maven.md
>> --------------------------
>> File to patch: 
>> 
>> 
>> 
>> 
>> 
>> 
>> Please advise.
>> Regards
>> Arthur
>> 
>> 
>> 
>> On 14 Sep, 2014, at 10:48 pm, Ted Yu <yuzhih...@gmail.com> wrote:
>> 
>>> Spark examples builds against hbase 0.94 by default.
>>> 
>>> If you want to run against 0.98, see:
>>> SPARK-1297 https://issues.apache.org/jira/browse/SPARK-1297
>>> 
>>> Cheers
>>> 
>>> On Sun, Sep 14, 2014 at 7:36 AM, arthur.hk.c...@gmail.com 
>>> <arthur.hk.c...@gmail.com> wrote:
>>> Hi, 
>>> 
>>> I have tried to to run HBaseTest.scala, but I  got following errors, any 
>>> ideas to how to fix them?
>>> 
>>> Q1) 
>>> scala> package org.apache.spark.examples
>>> <console>:1: error: illegal start of definition
>>>        package org.apache.spark.examples
>>> 
>>> 
>>> Q2) 
>>> scala> import org.apache.hadoop.hbase.mapreduce.TableInputFormat
>>> <console>:31: error: object hbase is not a member of package 
>>> org.apache.hadoop
>>>        import org.apache.hadoop.hbase.mapreduce.TableInputFormat
>>> 
>>> 
>>> 
>>> Regards
>>> Arthur
>>> 
>> 
>> 
>> 
> 
> 
