Thanks
> Best Regards
>
> On Sat, Sep 20, 2014 at 3:34 PM, Moshe Beeri <[hidden email]> wrote:
>
>> Hi Sean,
>>
>> Thanks a lot for the answer, I loved your excellent book
>> *Mahout in Action*.
val sc = new SparkContext(conf)
val logData = sc.textFile(logFile, 2).cache()
val numAs = logData.filter(line => line.contains("a")).count() // <- here is where the exception is thrown
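A side note on why the failure surfaces exactly at count(): Spark transformations such as filter are lazy, so no job runs until the first action. A minimal sketch, reusing logData from the snippet above:

    // filter is a lazy transformation: no job is submitted here
    val withA = logData.filter(line => line.contains("a"))
    // count() is an action: the job actually runs now, so any underlying
    // problem (here, a Hadoop client version mismatch) surfaces at this call
    val numAs = withA.count()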
Do you have any idea what's wrong?
Thanks,
Moshe Beeri.
Many thanks,
Moshe Beeri.
Any help will be welcome.
Many thanks,
Moshe Beeri.
054-3133943
Email | LinkedIn <http://www.linkedin.com/in/mobee>
On Sat, Sep 20, 2014 at 11:22 AM, Moshe Beeri wrote:
Thanks, Manu,
I just saw that I had included the hadoop-client 2.x dependency in my
pom.xml; removing it solved the problem (the kind of entry involved is
sketched below).
Thanks for your help
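For reference, a hypothetical sketch of the kind of pom.xml entry that triggers this: a hadoop-client 2.x dependency used together with a Spark build made against Hadoop 1 (the version number below is an assumption, not taken from the thread):

    <!-- Hypothetical example: a Hadoop 2.x client like this, combined with
         spark-1.1.0-bin-hadoop1 (built against Hadoop 1), mismatches at
         runtime. Removing this block, or using a Spark build that matches
         your Hadoop version, resolves the conflict. -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>2.4.0</version>
    </dependency>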
import org.apache.spark.{SparkConf, SparkContext}

object Nizoz {
  def connect(): Unit = {
    val conf = new SparkConf().setAppName("nizoz").setMaster("master")
    val spark = new SparkContext(conf)
    val lines = spark.textFile(
      "file:///home/moshe/store/frameworks/spark-1.1.0-bin-hadoop1/README.md")
    val lineLengths = lines.map(s => s.length)
    // reduce is the first action here, so the job runs at this point
    val totalLength = lineLengths.reduce(_ + _)
    println(s"Total length: $totalLength")
  }
}
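One thing worth noting about the snippet, separate from the dependency issue: "master" is not a valid Spark master URL. A minimal sketch of a valid configuration, assuming the goal is just a local run:

    import org.apache.spark.SparkConf

    // "local[*]" runs Spark in-process on all available cores;
    // a real cluster would use a URL such as "spark://host:7077"
    val conf = new SparkConf().setAppName("nizoz").setMaster("local[*]")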