RE: SPARKSQL problem with implementing Scala's Product interface

2014-07-10 Thread Haoming Zhang
our SimpleApp; if so you should include the hierarchical name. (Zongheng) On Thu, Jul 10, 2014 at 11:33 AM, Haoming Zhang wrote: Hi Yadid, I have the same problem as you, so I implemented the Product interface as well; even the cod

RE: SPARKSQL problem with implementing Scala's Product interface

2014-07-10 Thread Haoming Zhang
Hi Yadid, I have the same problem as you, so I implemented the Product interface as well; even my code is similar to yours. But now I face another problem: I don't know how to run the code... My whole program is like this: object SimpleApp { class Record(val x1: String, va

RE: SparkSQL with sequence file RDDs

2014-07-07 Thread Haoming Zhang
Hi Michael, Thanks for the reply. Actually, last week I tried to play with the Product interface, but I'm not really sure whether I did it correctly. Here is what I did: 1. Created an abstract class A that implements the Product interface, with 20 parameters; 2. Created a case class B that extends A, and B has 20 paramete
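For context, the workaround being discussed in this thread is implementing Scala's Product trait by hand, since case classes in Scala 2.10 are limited to 22 fields. A minimal sketch with 3 fields instead of 20 (the class name and field names here are illustrative, not taken from the thread):

```scala
// Sketch: a record class implementing Product directly, the usual
// workaround for the 22-field case class limit in Scala 2.10.
// SparkSQL can reflect over Product types to infer a schema.
class Record(val x1: String, val x2: String, val x3: Int)
    extends Product with Serializable {

  // Number of fields in the record.
  def productArity: Int = 3

  // Positional access to each field, as Product requires.
  def productElement(n: Int): Any = n match {
    case 0 => x1
    case 1 => x2
    case 2 => x3
    case _ => throw new IndexOutOfBoundsException(n.toString)
  }

  // Used by equality machinery; true for any other Record.
  def canEqual(that: Any): Boolean = that.isInstanceOf[Record]
}
```

In a real 20-plus-field record the pattern is the same, just with more cases in `productElement`.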

RE: SparkSQL with sequence file RDDs

2014-07-07 Thread Haoming Zhang
Hi Gray, Like Michael mentioned, you need to take care of the Scala case classes or Java beans, because SparkSQL needs the schema. Currently we are trying to insert our data into HBase with Scala 2.10.4 and Spark 1.0. All the data are tables. We created one case class for each row, which means th
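The schema inference being referred to can be sketched as follows for Spark 1.0's SparkSQL API, where a case class drives the table schema. This is a minimal sketch, not code from the thread; the `Person` case class and the `"people"` table name are illustrative:

```scala
// Sketch: registering an RDD of case class instances as a SQL table
// in Spark 1.0. SparkSQL infers the schema from the case class fields.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

case class Person(name: String, age: Int)

object SchemaExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("SchemaExample").setMaster("local"))
    val sqlContext = new SQLContext(sc)
    // Implicit conversion from RDD[Product] to SchemaRDD (Spark 1.0 API).
    import sqlContext.createSchemaRDD

    val people = sc.parallelize(Seq(Person("Alice", 30), Person("Bob", 25)))
    people.registerAsTable("people")

    val adults = sqlContext.sql("SELECT name FROM people WHERE age >= 18")
    adults.collect().foreach(println)
    sc.stop()
  }
}
```

A hand-rolled Product class (as in the earlier thread) can stand in for the case class when the row has more than 22 fields.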

RE: problem when start spark streaming in cluster mode

2014-06-27 Thread Haoming Zhang
Hi Siyuan, Can you try this solution? http://stackoverflow.com/questions/21943353/akka-2-3-0-fails-to-load-slf4jeventhandler-class-with-java-lang-classnotfounde Best. Date: Fri, 27 Jun 2014 14:18:59 -0400 Subject: problem when start spark streaming in cluster mode From: hsy...@gmail.com To: use
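The linked Stack Overflow question is about Akka failing to load `Slf4jEventHandler` with a `ClassNotFoundException`, which typically points at a missing or version-mismatched `akka-slf4j` artifact on the classpath. A hedged sbt sketch (this is an assumption about the fix, and the versions shown must match the Akka version your Spark build was compiled against):

```scala
// build.sbt fragment (sketch, not from the thread): make sure akka-slf4j
// is on the classpath and its version matches the rest of Akka.
libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-slf4j" % "2.3.0",
  "org.slf4j"          % "slf4j-api"  % "1.7.5"
)
```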

Error in run spark.ContextCleaner under Spark 1.0.0

2014-06-23 Thread Haoming Zhang
Hi all, I tried to run a simple Spark Streaming program with sbt. Compilation succeeded, but when I run the program I get an error: "ERROR spark.ContextCleaner: Error in cleaning thread". I'm not sure whether this is a bug or something else, because I can get the result I expected,