--
Alessandro Finamore, PhD
Politecnico di Torino
Office: +39 0115644127
Mobile: +39 3280251485
SkypeId: alessandro.finamore
On 5 July 2014 23:08, Mayur Rustagi [via Apache Spark User List] wrote:
> Key idea is to simulate your app time as you enter data. So you can connect
> spark streaming to a queue and insert data in it spaced by time. Easier said
> than done :).
I see.
I'll try to implement this solution as well.
Key idea is to simulate your app time as you enter data. So you can
connect spark streaming to a queue and insert data in it spaced by time.
Easier said than done :). What are the parallelism issues you are hitting
with your static approach?
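
A minimal sketch of that idea with queueStream, assuming pre-collected string
records and a 1s batch interval (the data and pacing below are hypothetical):

import org.apache.spark.SparkConf
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.{Seconds, StreamingContext}
import scala.collection.mutable

val conf = new SparkConf().setMaster("local[2]").setAppName("replay")
val ssc = new StreamingContext(conf, Seconds(1))

// Spark Streaming polls this queue once per batch interval
val rddQueue = new mutable.Queue[RDD[String]]()
val stream = ssc.queueStream(rddQueue)
stream.count().print()

ssc.start()

// Feed the already-collected data one "application second" at a time,
// so batch boundaries mimic the original collection time
val batches: Seq[Seq[String]] = Seq(Seq("a", "b"), Seq("c"))  // hypothetical data
for (batch <- batches) {
  rddQueue.synchronized { rddQueue += ssc.sparkContext.parallelize(batch) }
  Thread.sleep(1000)  // pace inserts to match the batch interval
}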
On Friday, July 4, 2014, alessandro finamore wrote:
The windowing capabilities of Spark Streaming determine the events in the RDD
created for that time window. If the duration is 1s, then all the events
received in a particular 1s window will be part of the RDD created for that
window for that stream.
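
In code, a sliding window just re-groups those per-batch RDDs; a minimal
sketch (the source and durations here are hypothetical):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf().setMaster("local[2]").setAppName("windowing")
// 1s batches: each RDD holds exactly the events received in that 1s interval
val ssc = new StreamingContext(conf, Seconds(1))
val events = ssc.socketTextStream("localhost", 9999)  // hypothetical source

// Last 30s of events, recomputed every 10s; both durations must be
// multiples of the batch duration
val windowed = events.window(Seconds(30), Seconds(10))
windowed.count().print()

ssc.start()
ssc.awaitTermination()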
On Friday, July 4, 2014 1:28 PM, alessandro finamore wrote:
Thanks for the replies.
What is not completely clear to me is how time is managed.
I can create a DStream from a file.
But if I set the window property, that will be bound to the application time,
right?
If I got it right, with a receiver I can control the way DStreams are
created.
But, how can appl
Another alternative could be to use Spark Streaming's textFileStream with its
windowing capabilities.
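
Something along these lines, assuming the input files are dropped into a
monitored directory (the path below is hypothetical):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf().setMaster("local[2]").setAppName("file-window")
val ssc = new StreamingContext(conf, Seconds(10))

// Picks up new files as they appear in the directory; note that windows are
// still driven by arrival (wall-clock) time, not by timestamps inside the files
val lines = ssc.textFileStream("hdfs:///data/incoming")  // hypothetical path
lines.window(Seconds(60), Seconds(10)).count().print()

ssc.start()
ssc.awaitTermination()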
On Friday, July 4, 2014 9:52 AM, Gianluca Privitera wrote:
You should think about a custom receiver, in order to solve the problem of the
“already collected” data.
http://spark.apache.org/docs/latest/streaming-custom-receivers.html
Gianluca
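
A rough sketch of such a receiver, replaying already-collected records paced
by their original timestamps (the record format and pacing are assumptions,
not from the thread):

import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// Replays (timestampMs, payload) records, sleeping to reproduce the
// original inter-arrival gaps (record format is hypothetical)
class ReplayReceiver(records: Seq[(Long, String)])
    extends Receiver[String](StorageLevel.MEMORY_ONLY) {

  def onStart(): Unit = {
    new Thread("replay-receiver") {
      override def run(): Unit = {
        var prev = records.headOption.map(_._1).getOrElse(0L)
        for ((ts, payload) <- records if !isStopped()) {
          Thread.sleep(math.max(0L, ts - prev))
          store(payload)  // hands the record to Spark for the current batch
          prev = ts
        }
      }
    }.start()
  }

  def onStop(): Unit = {}  // the replay thread checks isStopped() and exits
}

// usage: val stream = ssc.receiverStream(new ReplayReceiver(myRecords))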
On 04 Jul 2014, at 15:46, alessandro finamore
<alessandro.finam...@polito.it> wrote:
Hi,
I have a large