I am trying to execute a query on SAP HANA using Spark SQL from the spark-shell. I
am able to create the DataFrame object, but calling any action on it throws
java.io.NotSerializableException.

Steps I followed after adding the SAP HANA driver jar to the Spark classpath
(an equivalent DataFrameReader form is sketched after the list):

1. Start spark-shell
2. val df = sqlContext.load("jdbc", Map(
     "url" -> "jdbc:sap://172.26.52.54:30015/?databaseName=system&user=SYSTEM&password=Saphana123",
     "dbtable" -> "SYSTEM.TEST1"));
3. df.show();

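For reference, I believe the same load can also be written with the DataFrameReader
API (available since Spark 1.4); the sketch below simply restates step 2 with the
options split out, using the same host, credentials, and table name as above:

    // Equivalent of step 2 using sqlContext.read instead of the older load() call;
    // the URL, user, password, and table are the same values shown in the steps.
    val df = sqlContext.read
      .format("jdbc")
      .option("url", "jdbc:sap://172.26.52.54:30015/?databaseName=system")
      .option("user", "SYSTEM")
      .option("password", "Saphana123")
      .option("dbtable", "SYSTEM.TEST1")
      .load()

    // Any action (show, count, collect, ...) serializes the task closure,
    // including the JDBC connection properties, out to the executors.
    df.show()
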
I get the exception below on calling any action on the DataFrame object.

org.apache.spark.SparkException: Job aborted due to stage failure: Task not
serializable: java.io.NotSerializableException:
com.sap.db.jdbc.topology.Host
Serialization stack:
        - object not serializable (class: com.sap.db.jdbc.topology.Host, value:
172.26.52.54:30015)
        - writeObject data (class: java.util.ArrayList)
        - object (class java.util.ArrayList, [172.26.52.54:30015])
        - writeObject data (class: java.util.Hashtable)
        - object (class java.util.Properties, 
{dburl=jdbc:sap://172.26.52.54:30015,
user=SYSTEM, password=Saphana123,
url=jdbc:sap://172.26.52.54:30015/?system&user=SYSTEM&password=Saphana123,
dbtable=SYSTEM.TEST1, hostlist=[172.26.52.54:30015]})


Caused by: java.io.NotSerializableException: com.sap.db.jdbc.topology.Host
Serialization stack:
        - object not serializable (class: com.sap.db.jdbc.topology.Host, value:
172.26.52.54:30015)
        - writeObject data (class: java.util.ArrayList)
        - object (class java.util.ArrayList, [172.26.52.54:30015])
        - writeObject data (class: java.util.Hashtable)
        - object (class java.util.Properties, 
{dburl=jdbc:sap://172.26.52.54:30015,
user=SYSTEM, password=Saphana123,
url=jdbc:sap://172.26.52.54:30015/?system&user=SYSTEM&password=Saphana123,
dbtable=SYSTEM.TEST1, hostlist=[172.26.52.54:30015]})


I would appreciate any help on this.
Thank you


