Which version of Spark are you using?

On Tue, Sep 20, 2022, 1:57 PM Akash Vellukai wrote:
Hello,
py4j.protocol.Py4JJavaError: An error occurred while calling o80.load. :
java.lang.NoClassDefFoundError:
org/apache/spark/sql/internal/connector/SimpleTableProvider
Could anyone help me solve this issue?
Thanks and regards
Akash
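For context: a NoClassDefFoundError on
org/apache/spark/sql/internal/connector/SimpleTableProvider during load() is
typically a Spark version mismatch - the Kafka source artifact (e.g.
spark-sql-kafka-0-10) was built against a different Spark major version than
the one actually installed, which is why the Spark version question matters
here. Below is a minimal sketch of the read that is failing, written in Java;
the class name, broker address and topic name are placeholders, not taken
from the original post.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class KafkaLoadSketch {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("kafka-load-sketch")
                .getOrCreate();

        // load() is the call that fails (o80.load) when the spark-sql-kafka
        // artifact targets a different Spark major version than the runtime.
        // The usual fix is to pick the artifact that matches the installed
        // Spark, e.g. spark-sql-kafka-0-10_2.12 at the same version as Spark.
        Dataset<Row> df = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "localhost:9092") // placeholder broker
                .option("subscribe", "events")                       // placeholder topic
                .load();

        df.printSchema();
    }
}

From PySpark the same load() call goes through py4j, which is why the
mismatch shows up as a Py4JJavaError wrapping the Java NoClassDefFoundError.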
I am curious why you are using the 1.0.4 java artifact together with the
latest 1.1.0. This might be your compilation problem - the older java version.
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector_2.10</artifactId>
  <version>1.1.0</version>
</dependency>
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector-java_2.10</artifactId>
  <version>1.0.4</version>
</dependency>
See:
- doc
https:
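If the intent is to stay on the 1.1.x line, both artifacts would normally be
kept at the same version. Assuming a matching 1.1.0 release of the java
module is published for your Scala build (an assumption to verify against the
repository), the aligned dependencies would look roughly like:

<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector_2.10</artifactId>
  <version>1.1.0</version>
</dependency>
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector-java_2.10</artifactId>
  <version>1.1.0</version>
</dependency>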
These seem to be compilation errors. The second one seems to be that you are
using CassandraJavaUtil.javaFunctions incorrectly. Look at the documentation
and set the parameter list correctly.
TD
On Mon, Dec 8, 2014 at 9:47 AM, wrote:
Hi,
I am intending to save the streaming data from Kafka into Cassandra,
using spark-streaming.
But there seems to be a problem with the line
javaFunctions(data).writerBuilder("testkeyspace", "test_table",
mapToRow(TestTable.class)).saveToCassandra();
I am getting 2 errors.
the code, the error-log and
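For reference, here is a minimal, self-contained sketch of the same
saveToCassandra() call, assuming both connector artifacts are on the 1.1.x
line (where writerBuilder and mapToRow are available). The TestTable bean
fields, the Cassandra host and the sample data are invented for illustration
and should be replaced with the real schema; for a DStream, the same builder
call can be issued per micro-batch inside foreachRDD.

import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;

import java.io.Serializable;
import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class SaveToCassandraSketch {

    // Assumed bean: one row of testkeyspace.test_table, with columns id and
    // value inferred from the getter names (an assumption, not the real schema).
    public static class TestTable implements Serializable {
        private Integer id;
        private String value;

        public TestTable() {}
        public TestTable(Integer id, String value) { this.id = id; this.value = value; }

        public Integer getId() { return id; }
        public void setId(Integer id) { this.id = id; }
        public String getValue() { return value; }
        public void setValue(String value) { this.value = value; }
    }

    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("save-to-cassandra-sketch")
                .set("spark.cassandra.connection.host", "127.0.0.1"); // placeholder host
        JavaSparkContext sc = new JavaSparkContext(conf);

        JavaRDD<TestTable> rows = sc.parallelize(Arrays.asList(
                new TestTable(1, "a"),
                new TestTable(2, "b")));

        // This is the call from the post above; it only compiles when the
        // java module matches the 1.1.x core connector, since the 1.0.x java
        // module does not provide writerBuilder/mapToRow.
        javaFunctions(rows)
                .writerBuilder("testkeyspace", "test_table", mapToRow(TestTable.class))
                .saveToCassandra();

        sc.stop();
    }
}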