Compiling from source with Scala 2.11 support fixed this issue. Thanks
again for the help!
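
In case it helps anyone else: the build steps I used were roughly the ones from
the "Building Spark" docs for Scala 2.11. The exact script name and Maven flags
may differ between Spark versions, so treat this as a sketch:

  ./dev/change-scala-version.sh 2.11
  mvn -Dscala-2.11 -DskipTests clean package
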
On Tue, Sep 8, 2015 at 7:33 AM, Gheorghe Postelnicu <
gheorghe.posteln...@gmail.com> wrote:
> Good point. It is a pre-compiled Spark version. Based on the text on the
> downloads page, the answer seems to be no - is there a pre-built version
> compiled with scala 2.11?
>
> On Mon, Sep 7, 2015 at 5:29 PM, Gheorghe Postelnicu <
> gheorghe.posteln...@gmail.com> wrote:
>
>> sbt assembly; $SPARK_HOME/bin/spark-submit --class main.scala.TestMain
>> --master "local[4]" target/scala-2.11/bof-assembly-0.1-SNAPSHOT.jar
>
> On Monday, September 7, 2015, Gheorghe Postelnicu <
> gheorghe.posteln...@gmail.com> wrote:
>
>> Interesting idea. Tried that, didn't work. Here is my new SBT file:
>>
>> name := """testMain"""
>>
>>
>> libraryDependencies += "org.scala-lang" % "scala-reflect" % "2.11.6"
>
>
> I believe that Spark shades the scala library, and it looks like this is a
> library that you need in an unshaded form.
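>
> Something along these lines in build.sbt should pull it in directly and keep
> it in sync with the project's Scala version (just a sketch, not tested):
>
>   libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value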
>
>
> 2015-09-07 16:48 GMT-04:00 Gheorghe Postelnicu <
> gheorghe.posteln...@gmail.com>:
>> Hi,
>>
>> The following code fails when compiled from SBT:
>>
>> package main.scala
>>
>> import org.apache.spark.SparkContext
>> import org.apache.spark.sql.SQLContext
>>
>> object TestMain {
>>   def main(args: Array[String]): Unit = {
>>     implicit val sparkContext = new SparkContext()
>>     val sqlContext = new SQLContext(sparkContext)
>>   }
>> }