Thanks Akhil. Both ways work for me, but I'd like to know why that
exception was thrown. The class HBaseApp and its related classes were all
contained in my application jar, so why was
*com.xt.scala.HBaseApp$$anonfun$testHBase$1* not found?
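[For context: a name like `HBaseApp$$anonfun$testHBase$1` is a synthetic class the Scala compiler generates for a closure inside `testHBase` — each closure becomes its own class file in the jar, and the executors must be able to load it, which is why the jar has to be shipped to them. A minimal sketch showing that a closure gets its own generated class (the exact name is Scala/JVM-version specific, so it is only checked loosely here):]

```scala
object ClosureDemo {
  def main(args: Array[String]): Unit = {
    // A plain Scala closure; the compiler emits a synthetic class for it
    // (an anonfun class in Scala 2.9/2.10, a JVM lambda class in 2.12+).
    val addOne = (x: Int) => x + 1

    // The runtime class name embeds the enclosing class, e.g.
    // "ClosureDemo$$anonfun$..." or "ClosureDemo$$$Lambda$...".
    println(addOne.getClass.getName)
  }
}
```

When Spark serializes a task containing such a closure and sends it to a worker, the worker deserializes it and must find that synthetic class on its classpath — hence the `ClassNotFoundException` when the application jar is not distributed.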
2014-10-13 14:53 GMT+08:00 Akhil Das:
Adding your application jar to the sparkContext will resolve this issue.
Eg:
sparkContext.addJar("./target/scala-2.10/myTestApp_2.10-1.0.jar")
Thanks
Best Regards
On Mon, Oct 13, 2014 at 8:42 AM, Tao Xiao wrote:
In the beginning I tried to read HBase and found that the exception was
thrown, so I started to debug the app. I removed the code that reads HBase
and tried to save an RDD containing a list, and the exception was still
thrown. So I'm sure the exception was not caused by reading HBase.
While debugging I
Your app is named scala.HBaseApp
Does it read/write to HBase?
Just curious.
On Sun, Oct 12, 2014 at 8:00 AM, Tao Xiao wrote:
> Hi all,
>
> I'm using CDH 5.0.1 (Spark 0.9) and submitting a job in Spark Standalone
> Cluster mode.
>
> The job is quite simple as follows:
>
> object HBaseApp {