Then why don't you use nscala-time_2.10-1.8.0.jar instead of
nscala-time_2.11-1.8.0.jar?
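For example, with the jar names from this thread, the Scala 2.10 build of the library (plus its Joda dependencies) can be passed to spark-shell on the command line. This is a sketch assuming the jars sit in the current directory and a Spark release whose spark-shell accepts --jars:

```shell
# Pass jars whose Scala binary version (_2.10) matches the Spark build,
# so the classes load correctly inside the spark-shell REPL.
spark-shell --jars joda-convert-1.5.jar,joda-time-2.4.jar,nscala-time_2.10-1.8.0.jar
```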
On Tue Feb 17 2015 at 5:55:50 PM Hammam CHAMSI wrote:
I can use nscala-time with Scala, but my issue is that I can't use it within
the spark-shell console! It gives me the error below
Great, or you can just use the nscala-time build for Scala 2.10!
On Tue Feb 17 2015 at 5:41:53 PM Hammam CHAMSI wrote:
Thanks Kevin for your reply,
I downloaded the pre-built version and, as you said, the default Spark Scala
version is 2.10. I'm now building Spark
Re: Use of nscala-time within spark-shell
To: hscha...@hotmail.com; user@spark.apache.org
Which Scala version was used to build your Spark?
It seems your nscala-time library is built for Scala 2.11,
while the default Spark build uses Scala 2.10.
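A quick way to confirm which side of the mismatch you are on is to print the Scala version from inside the spark-shell REPL. This is a small sketch; `versionNumberString` is a Scala standard-library property, nothing Spark-specific:

```scala
// Inside spark-shell: print the Scala version the shell was built with.
// If this prints 2.10.x, jars published with the _2.11 suffix will not
// be binary-compatible with this REPL.
println(scala.util.Properties.versionNumberString)
```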
On Tue Feb 17 2015 at 1:51:47 AM Hammam CHAMSI wrote:
Hi All,
Thanks in advance for your help. I have a timestamp which I need
to convert to a datetime using Scala. A folder contains the three needed
jar files: "joda-convert-1.5.jar joda-time-2.4.jar
nscala-time_2.11-1.8.0.jar"
Using the Scala REPL and adding the jars (scala -classpath "*.jar")
I can
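For reference, once the jar's Scala binary version matches the REPL, the conversion itself is short with nscala-time. A sketch assuming the 1.8.0 artifact from the thread is on the classpath and the timestamp is epoch milliseconds (the sample value below is hypothetical):

```scala
// nscala-time re-exports Joda-Time types such as DateTime via Imports.
import com.github.nscala_time.time.Imports._

// Convert an epoch timestamp (milliseconds) to a Joda DateTime.
val ts = 1424131200000L   // hypothetical sample timestamp
val dt = new DateTime(ts) // Joda-Time constructor, available through the import
println(dt)
```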