My fault, I didn't notice the "11" in the jar name. It is working now with
nscala-time_2.10-1.8.0.jar.
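For reference, a minimal sketch of this kind of usage inside spark-shell (the
timestamp values are made up, and adding the jars via spark-shell --jars
joda-time-2.4.jar,joda-convert-1.5.jar,nscala-time_2.10-1.8.0.jar is just one
way of putting them on the classpath):

  import com.github.nscala_time.time.Imports._

  // a small example RDD of epoch timestamps in milliseconds
  val times = sc.parallelize(Seq(1424131200000L, 1424217600000L))
  // convert each timestamp to a joda-time DateTime and print it as an ISO string
  times.map(ms => new DateTime(ms).toString).collect().foreach(println)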
Thanks Kevin
From: kevin...@apache.org
Date: Tue, 17 Feb 2015 08:58:13 +0000
Subject: Re: Use of nscala-time within spark-shell
To: hscha...@hotmail.com; kevin...@apache.org; user@spark.apache.org
> I can use nscala-time with scala, but my issue is that I can't use it within
> the spark-shell console! It gives me the error below.
>
> Thanks
>
> --
> From: kevin...@apache.org
> Date: Tue, 17 Feb 2015 08:50:04 +0000
> Subject: Re: Use of nscala-time within spark-shell
> To: hscha...@hotmail.com; kevin...@apache.org; user@spark.apache.org
>
> Great, or you can just use the nscala-time build for Scala 2.10.
I can use nscala-time with scala, but my issue is that I can't use it within the
spark-shell console! It gives me the error below.
Thanks
From: kevin...@apache.org
Date: Tue, 17 Feb 2015 08:50:04 +0000
Subject: Re: Use of nscala-time within spark-shell
To: hscha...@hotmail.com; kevin...@apache.org; user@spark.apache.org
> I will try to build Spark with Scala 2.11. I'll share
> the results here.
>
> Regards,
>
> --
> From: kevin...@apache.org
> Date: Tue, 17 Feb 2015 01:10:09 +0000
> Subject: Re: Use of nscala-time within spark-shell
> To: hscha...@hotmail.com; user@spark.apache.org
From: kevin...@apache.org
Date: Tue, 17 Feb 2015 01:10:09 +0000
Subject: Re: Use of nscala-time within spark-shell
To: hscha...@hotmail.com; user@spark.apache.org
Hi All,

Which Scala version was used to build your Spark? It seems your nscala-time
library is built for Scala 2.11, while the default Spark build uses Scala 2.10.
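If you are not sure which Scala version your spark-shell runs on, a quick check
from inside the shell (just a sketch; the exact version string will differ):

  // prints the Scala version the running shell was built with, e.g. "version 2.10.4";
  // the nscala-time artifact suffix (_2.10 vs _2.11) has to match it
  scala.util.Properties.versionString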
On Tue Feb 17 2015 at 1:51:47 AM Hammam CHAMSI wrote:
> Hi All,
>
> Thanks in advance for your help. I have a timestamp which I need to convert
> to datetime using Scala.
Hi All,
Thanks in advance for your help. I have a timestamp which I need
to convert to datetime using Scala. A folder contains the three needed
jar files: joda-convert-1.5.jar, joda-time-2.4.jar and
nscala-time_2.11-1.8.0.jar.
Using the Scala REPL and adding the jars with scala -classpath "*.jar",
I can use nscala-time without any problem, but I can't get it to work within
the spark-shell console; it gives me an error.
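For example, a minimal sketch of what I mean (the timestamp value is made up;
this assumes the three jars above are on the REPL classpath):

  import com.github.nscala_time.time.Imports._

  // convert an example epoch timestamp (milliseconds) to a joda-time DateTime
  val ts = 1424131200000L
  val dt = new DateTime(ts)
  println(dt)   // prints the ISO-formatted date/time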
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Use-of-nscala-time-within-spark-shell-tp21624.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.