From: "Lalwani, Jayesh"
Subject: Re: [Structured Streaming] More than 1 streaming in a code

… streaming data frame with another streaming data frame is not supported.

> From: spark receiver
> Date: Friday, April 13, 2018 at 11:49 PM
> To: Aakash Basu
> Cc: Panagiotis Garefalakis, user <user@spark.apache.org>
> Subject: Re: [Structured Streaming] More than 1 streaming in a code
>
> Hi Panagiotis,
>
> Wondering whether you solved the problem or not?
From: "Lalwani, Jayesh"
Subject: Re: [Structured Streaming] More than 1 streaming in a code

You could have a really large window.

> From: Aakash Basu
> Date: Monday, April 16, 2018 at 10:56 AM
> To: "Lalwani, Jayesh"
> Cc: spark receiver, Panagiotis Garefalakis, user
> Subject: Re: [Structured Streaming] More than 1 streaming in a code
>
> If I use timestamp based windowing, then …
>
>> … streaming data frame with another streaming data frame is not supported.
From: Aakash Basu
Date: Monday, April 16, 2018 at 4:52 AM
To: "Lalwani, Jayesh"
Cc: spark receiver, Panagiotis Garefalakis, user
Subject: Re: [Structured Streaming] More than 1 streaming in a code

Hey Jayesh and Others,

Is there then any other way to come to a solution for this use-case?

Thanks,
Aakash.
From: spark receiver
Date: Friday, April 13, 2018 at 11:49 PM
To: Aakash Basu
Cc: Panagiotis Garefalakis, user
Subject: Re: [Structured Streaming] More than 1 streaming in a code

Hi Panagiotis,

Wondering whether you solved the problem or not? I met the same issue today. I'd appreciate it so much if you could paste the code snippet if it's working.

Thanks.
From: Aakash Basu
Date: April 6, 2018 at 7:40 AM
Subject: Re: [Structured Streaming] More than 1 streaming in a code

Hi Panagiotis,

I did that, but it still prints the result of the first query and waits for new data; it doesn't even go on to the next one.

Data:

$ nc -lk 9998
1,2
3,4
5,6
7,8

Result:

---
Batch: 0
---
+…
|a…
From: Panagiotis Garefalakis
Subject: Re: [Structured Streaming] More than 1 streaming in a code

Hello Aakash,

When you use query.awaitTermination you are pretty much blocking there, waiting for the current query to stop or throw an exception. In your case the second query will not even start.

What you could do instead is remove all the blocking calls and use spark.streams.awaitAnyTermination.