Apache gives exception when running groupby on df temp table

2015-07-16 Thread nipun
I have a dataframe. I register it as a temp table and run a Spark SQL query on it to get another dataframe. When I run groupBy on the result, it throws this exception: Lost task 1.3 in stage 21.0 (TID 579, 172.28.0.162): java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.s
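The stack trace is truncated, but a java.lang.ClassCastException always means a value of one runtime class was cast to an incompatible class, e.g. a column that actually holds Strings being read as some other type. A minimal plain-Java sketch of that failure mode (illustration only; this is not Spark or connector code):

```java
public class CastDemo {
    // Attempt an Integer cast on an arbitrary value and report what happened.
    static String tryCast(Object value) {
        try {
            Integer n = (Integer) value; // throws if value is really a String
            return "ok: " + n;
        } catch (ClassCastException e) {
            return "ClassCastException: " + e.getMessage();
        }
    }

    public static void main(String[] args) {
        // A String stored where the code expects an Integer -- the same
        // shape of mismatch as in the reported stack trace.
        System.out.println(tryCast("42"));
        System.out.println(tryCast(42));
    }
}
```

In a Spark setting this typically points at a mismatch between the schema the query engine inferred and the types actually stored in the source table.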

Re: Apache gives exception when running groupby on df temp table

2015-07-17 Thread nipun
Spark version 1.4.

import com.datastax.spark.connector._
import org.apache.spark._
import org.apache.spark.sql.cassandra.CassandraSQLContext
import org.apache.spark.SparkConf
//import com.microsoft.sqlserver.jdbc.SQLServerDriver
import java.sql.Connection
import java.sql.DriverManager
import java.

Re: Apache gives exception when running groupby on df temp table

2015-07-17 Thread nipun
You are right, there are some odd things about this connector. Earlier I got an OOM exception with it because of a bug that transferred only 64 bytes before closing the connection, and now this one. Strangely, I copied the data into another data frame and it work

[Spark Streaming] Dynamic Broadcast Variable Update

2017-05-02 Thread Nipun Arora
d like to integrate these changes in Spark; can anyone please let me know the process of submitting patches/new features to Spark? Also, I understand that the current version of Spark is 2.1. However, our changes have been done and tested on Spark 1.6.2; will this be a problem? Thanks Nipun


BUG: 1.3.0 org.apache.spark.sql.Row Does not exist in Java API

2015-04-17 Thread Nipun Batra
;);
// The results of SQL queries are DataFrames and support all the normal RDD operations.
// The columns of a row in the result can be accessed by ordinal.
List names = results.map(new Function() {
    public String call(Row row) {
        return "Name: " + row.getString(0);
    }
}).collect();

Thanks Nipun
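The snippet above uses a raw Function and a raw List, so the Row-to-String types are lost at compile time. A self-contained sketch of the same typed per-row mapping pattern in plain Java (the Row interface here is a hypothetical stand-in, not org.apache.spark.sql.Row, so the example runs without Spark on the classpath):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class RowMapDemo {
    // Hypothetical stand-in for a row type; NOT Spark's org.apache.spark.sql.Row.
    interface Row {
        String getString(int ordinal);
    }

    // Typed equivalent of the raw results.map(new Function() { ... }) call:
    // the generic signatures keep the compiler aware that each element is a Row
    // and that the result is a List of Strings.
    static List<String> names(List<Row> rows) {
        return rows.stream()
                   .map(row -> "Name: " + row.getString(0))
                   .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Row> rows = Arrays.asList(ordinal -> "Alice", ordinal -> "Bob");
        System.out.println(names(rows)); // [Name: Alice, Name: Bob]
    }
}
```

Keeping the generic parameters (e.g. Function&lt;Row, String&gt; in the Spark Java API) is what lets such cast problems surface at compile time instead of at runtime.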