-----Original Message-----
From: bernh...@chapter7.ch [mailto:bernh...@chapter7.ch]
Sent: Tuesday, February 9, 2016 10:47 PM
To: Mohammed Guller
Cc: user@spark.apache.org
Subject: Re: [Spark Streaming] Joining Kafka and Cassandra DataFrames
Hi Mohammed
I'm aware of that documentation, …
-----Original Message-----
From: bernh...@chapter7.ch [mailto:bernh...@chapter7.ch]
Sent: Tuesday, February 9, 2016 10:05 PM
To: Mohammed Guller
Cc: user@spark.apache.org
Subject: Re: [Spark Streaming] Joining Kafka and Cassandra DataFrames
Hi Mohammed
Thanks for the hint, I should probably do that :)
As for the DF singleton:
/**
 * Lazily instantiated singleton instance of base_data DataFrame
 */
object base_data_df {
  @transient private var instance: DataFrame = _
  def getInstance(sqlContext: SQLContext): DataFrame = {
    if (instance == null) {
      // Load from the C* data source on first use (keyspace/table names below are placeholders)
      instance = sqlContext.read.format("org.apache.spark.sql.cassandra")
        .options(Map("keyspace" -> "my_keyspace", "table" -> "base_data")).load()
    }
    instance
  }
}
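
Assuming a singleton along those lines, a minimal sketch of how it might be wired into a streaming job to join each Kafka micro-batch against the Cassandra-backed DataFrame (Spark 1.6-era APIs with the Kafka 0.8 direct stream). The broker address, the "events" topic, the Event case class, the comma-separated payload format, and the "id" join column are illustrative assumptions, not details from this thread:

import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.sql.SQLContext
import org.apache.spark.streaming.kafka.KafkaUtils
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Hypothetical shape of the Kafka payload; not taken from the original thread.
case class Event(id: String, value: Double)

object KafkaCassandraJoin {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("kafka-cassandra-join")
      .set("spark.cassandra.connection.host", "127.0.0.1") // placeholder host
    val ssc = new StreamingContext(conf, Seconds(10))

    // Direct Kafka stream; broker list and topic name are placeholders.
    val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")
    val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, Set("events"))

    stream.map(_._2).foreachRDD { rdd =>
      val sqlContext = SQLContext.getOrCreate(rdd.sparkContext)
      import sqlContext.implicits._

      // Turn the micro-batch into a DataFrame (assumes a simple "id,value" CSV payload).
      val kafka_df = rdd.map(_.split(","))
        .map(fields => Event(fields(0), fields(1).toDouble))
        .toDF()

      // Reuse the lazily created Cassandra-backed DataFrame and join it with the batch.
      val base_data = base_data_df.getInstance(sqlContext)
      kafka_df.join(base_data, "id").show()
    }

    ssc.start()
    ssc.awaitTermination()
  }
}

Using SQLContext.getOrCreate together with the lazily created base_data_df keeps both the SQL context and the Cassandra DataFrame definition from being rebuilt on every batch.
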
You may have better luck with this question on the Spark Cassandra Connector
mailing list.
One quick question about this code from your email:
// Load DataFrame from C* data-source
val base_data = base_data_df.getInstance(sqlContext)
What exactly is base_data_df and how are you creating it?
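
For reference, a self-contained sketch of what a "// Load DataFrame from C* data-source" step typically looks like with the Spark Cassandra Connector's DataFrame data source; the connection host, keyspace, and table names are placeholders, since the thread does not show them:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.{DataFrame, SQLContext}

object LoadBaseData {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("load-base-data")
      .set("spark.cassandra.connection.host", "127.0.0.1") // placeholder host
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    // Expose a Cassandra table as a DataFrame via the connector's data source.
    // Keyspace and table names are placeholders.
    val base_data: DataFrame = sqlContext.read
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "my_keyspace", "table" -> "base_data"))
      .load()

    base_data.printSchema()
    base_data.show(5)
  }
}

Wrapping that read in the base_data_df.getInstance singleton shown earlier means the DataFrame is defined once on the driver and reused across micro-batches, rather than being re-created inside every foreachRDD call.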