I don't think Spark Streaming currently supports window operations whose data 
exceeds the available memory. Internally, Spark Streaming keeps all the data 
belonging to the effective window in memory; if memory is insufficient, the 
BlockManager will discard blocks according to an LRU policy, so unexpected 
behavior may occur.
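For what it's worth, one way to reduce the memory footprint of a large window 
is the inverse-reduce form of reduceByKeyAndWindow together with checkpointing 
and a storage level that can spill to disk. A minimal sketch, not a tested 
job: the socket source, host/port, and checkpoint path below are placeholders 
standing in for your Kafka input, and it needs a running Spark deployment.

```python
# Sketch only: placeholders for the real input source and checkpoint path.
from pyspark import SparkContext, StorageLevel
from pyspark.streaming import StreamingContext

sc = SparkContext(appName="large-window-sketch")
ssc = StreamingContext(sc, batchDuration=10)
ssc.checkpoint("/tmp/checkpoint")  # required by the inverse-reduce form

lines = ssc.socketTextStream("localhost", 9999)  # stand-in for Kafka input
pairs = lines.map(lambda w: (w, 1))

# Incremental window: new batches are added and expiring batches subtracted,
# so the whole window's raw data need not be re-reduced on every slide.
counts = pairs.reduceByKeyAndWindow(
    lambda a, b: a + b,   # add values entering the window
    lambda a, b: a - b,   # subtract values leaving the window
    windowDuration=3600,  # 1-hour window
    slideDuration=60)     # slide every minute

# Let blocks spill to disk rather than be dropped under memory pressure.
counts.persist(StorageLevel.MEMORY_AND_DISK_SER)
counts.pprint()

ssc.start()
ssc.awaitTermination()
```

This keeps only the running per-key aggregates as window state; the raw 
blocks of the full window still pass through memory, so it mitigates rather 
than removes the limit Jerry describes.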

Thanks
Jerry

-----Original Message-----
From: avilevi3 [mailto:[email protected]] 
Sent: Monday, February 23, 2015 12:57 AM
To: [email protected]
Subject: spark streaming window operations on a large window size

Hi guys, 

does Spark Streaming support window operations on a sliding window whose data 
is larger than the available memory?
Currently we are using Kafka as input, but we could change that if needed.

thanks
Avi



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/spark-streaming-window-operations-on-a-large-window-size-tp21764.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]

