Hello

I just stumbled on exactly the same issue as you are discussing in this
thread. Here are my dependencies:
<dependencies>
        
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector_2.10</artifactId>
            <version>1.1.0</version>
        </dependency>
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector-java_2.10</artifactId>
            <version>1.1.0</version>
        </dependency>

        
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.1.2-SNAPSHOT</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <version>1.1.2-SNAPSHOT</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.1.2-SNAPSHOT</version>
            <scope>provided</scope>
        </dependency>
    </dependencies>
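
For what it's worth, one way to rule out instability in the 1.1.2-SNAPSHOT builds would be to pin the Spark artifacts to the released 1.1.1 version and check whether the error persists, e.g. (hypothetical fragment, just swapping the version on the same artifacts; I have not verified whether the behavior differs):

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.1.1</version>
            <scope>provided</scope>
        </dependency>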

As you can see, I am using the latest versions of Spark and the Spark
Cassandra Connector, and I still get the same error message:
Exception in thread "main" java.util.NoSuchElementException: head of empty
list

So, I don't believe this bug was really fixed in the Spark 1.1.1 release, as
reported above.

Did your problem get fixed with the latest Spark update?

Thanks,
Leon



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-SQL-parser-bug-tp15999p19793.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

