You are right, there are some odd things about this connector. Earlier I got an OOM exception with this connector because of a bug that transferred only 64 bytes before closing the connection, and now this one. Strangely, when I copied the data into another DataFrame, everything worked on the new DataFrame:

val dfCol1 = dfCol.limit(dfCol.count.toInt)

and groupBy now works.
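For what it's worth, here is a minimal sketch of that workaround, assuming a spark-shell session with the spark-cassandra-connector on the classpath (so sc/sqlContext are predefined); the keyspace, table, and column names below are placeholders, not the ones from my job:

import org.apache.spark.sql.DataFrame

// Load the Cassandra table through the connector's Data Source API.
val dfCol: DataFrame = sqlContext.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("keyspace" -> "my_keyspace", "table" -> "my_table"))  // placeholder names
  .load()

// Copy the data into a new DataFrame by limiting to the full row count;
// groupBy then worked on the copy, as described above.
val dfCol1 = dfCol.limit(dfCol.count.toInt)
dfCol1.groupBy("some_column").count().show()  // "some_column" is a placeholder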
Can you tell me what the difference is between using CassandraSQLContext and a plain SQLContext with the Cassandra driver loaded? I think internally Cassandra is loading the same driver either way, right?
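To make the question concrete, here is a rough sketch of the two paths as I understand them with spark-cassandra-connector 1.x, again pasted into spark-shell (keyspace/table/column names are placeholders):

import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.cassandra.CassandraSQLContext

// 1) Plain SQLContext + the connector's Data Source API: the Cassandra table
//    is loaded explicitly and registered as a temp table before running SQL.
val plainSql = new SQLContext(sc)
val df = plainSql.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("keyspace" -> "my_keyspace", "table" -> "my_table"))
  .load()
df.registerTempTable("my_table")
plainSql.sql("SELECT some_column, count(*) FROM my_table GROUP BY some_column").show()

// 2) CassandraSQLContext: keyspace.table names are resolved directly in the
//    SQL statement, with no temp-table registration. As far as I can tell,
//    both paths go through the same connector underneath, which is what the
//    question above is asking about.
val cc = new CassandraSQLContext(sc)
cc.sql("SELECT some_column, count(*) FROM my_keyspace.my_table GROUP BY some_column").show()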