Re: [Structured Streaming] More than 1 streaming in a code

2018-04-16 Thread Aakash Basu
…streaming data frame with another streaming data frame is not supported… (quoting the mail From: spark receiver, Date: Friday, April 13, 2018 at 11:49 PM, To: Aakash Basu, Cc: Panagiotis…)
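The limitation quoted above concerns joining one streaming DataFrame with another. Below is a minimal sketch of that kind of stream-stream join, assuming a rate source and illustrative column names that are not taken from this thread: releases before Spark 2.3 reject the join at query start with an AnalysisException, while 2.3 and later run it (with unbounded state unless watermarks are added).

from pyspark.sql import SparkSession
from pyspark.sql.functions import expr

spark = SparkSession.builder.appName("StreamStreamJoinSketch").getOrCreate()

# Two independent streaming DataFrames; the rate source and the renamed
# columns are assumptions made purely for illustration.
left = spark.readStream.format("rate").option("rowsPerSecond", 1).load() \
    .selectExpr("value AS left_value")
right = spark.readStream.format("rate").option("rowsPerSecond", 1).load() \
    .selectExpr("value AS right_value")

# Stream-stream inner equi-join: not supported before Spark 2.3,
# supported afterwards (join state grows without watermarks).
joined = left.join(right, expr("left_value = right_value"))

query = joined.writeStream \
    .outputMode("append") \
    .format("console") \
    .start()
query.awaitTermination()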

Re: [Structured Streaming] More than 1 streaming in a code

2018-04-16 Thread Gerard Maas
…trigger(processingTime='5 seconds').start() … query = wordCounts.writeStream.format("console").trigger(processingTime='5 seconds').start() … spark.streams.awaitAnyTermination()…
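The snippet above is the suggested pattern: start every query first, then block once on the shared StreamingQueryManager. A minimal runnable sketch of that pattern, assuming the usual socket word-count source from the Structured Streaming guide; the host, port, and exact transformations are illustrative, not copied verbatim from the thread.

from pyspark.sql import SparkSession
from pyspark.sql.functions import explode, split

spark = SparkSession.builder.appName("TwoSinksOneSource").getOrCreate()

# Single streaming source shared by both queries (host/port are assumptions).
lines = spark.readStream \
    .format("socket") \
    .option("host", "localhost") \
    .option("port", 9999) \
    .load()

words = lines.select(explode(split(lines.value, " ")).alias("word"))
wordCounts = words.groupBy("word").count()

# First sink: raw words to the console.
query1 = words.writeStream \
    .outputMode("append") \
    .format("console") \
    .trigger(processingTime='5 seconds') \
    .start()

# Second sink: running counts to the console.
query2 = wordCounts.writeStream \
    .outputMode("complete") \
    .format("console") \
    .trigger(processingTime='5 seconds') \
    .start()

# Block only after *both* queries have started; returns when either stops.
spark.streams.awaitAnyTermination()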

Re: [Structured Streaming] More than 1 streaming in a code

2018-04-16 Thread Lalwani, Jayesh
You could have a really large window. (Quoting Aakash Basu's mail of Monday, April 16, 2018 at 10:56 AM: "If I use timestamp based windowing, then…")
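A minimal sketch of what "a really large window" could look like in practice, assuming a rate source and a 365-day window chosen purely for illustration: the query still groups by event time, but effectively every event lands in a single bucket.

from pyspark.sql import SparkSession
from pyspark.sql.functions import window, col

spark = SparkSession.builder.appName("LargeWindowSketch").getOrCreate()

# Rate source stands in for the real input; it emits (timestamp, value) rows.
events = spark.readStream.format("rate").option("rowsPerSecond", 10).load()

# Timestamp-based windowing with a deliberately huge window duration.
counts = events.groupBy(window(col("timestamp"), "365 days")).count()

query = counts.writeStream \
    .outputMode("complete") \
    .format("console") \
    .trigger(processingTime='5 seconds') \
    .start()
query.awaitTermination()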

Re: [Structured Streaming] More than 1 streaming in a code

2018-04-16 Thread Aakash Basu
…Hey Jayesh and Others, is there then any other way to come to a solution for this use-case? Thanks, Aakash…

Re: [Structured Streaming] More than 1 streaming in a code

2018-04-16 Thread Lalwani, Jayesh
…(quoting Aakash Basu's mail of April 16, 2018 at 4:52 AM, To: Lalwani, Jayesh, Cc: spark receiver, Panagiotis Garefalakis, user) Hey Jayesh and Others, is there then any other way to come to a solution for this use-case? Thanks, Aakash…

Re: [Structured Streaming] More than 1 streaming in a code

2018-04-16 Thread Aakash Basu
…Hi Panagiotis, wondering if you solved the problem or not? I met the same issue today, and I'd appreciate it so much if you could paste the code snippet if it's working…

Re: [Structured Streaming] More than 1 streaming in a code

2018-04-15 Thread Lalwani, Jayesh
…(quoting spark receiver's mail of Friday, April 13, 2018, To: Aakash Basu, Cc: Panagiotis Garefalakis, user) Hi Panagiotis, wondering if you solved the problem or not? I met the same issue today, and I'd appreciate it so much if you could paste the code snippet if it's working…

Re: [Structured Streaming] More than 1 streaming in a code

2018-04-13 Thread spark receiver
…What you could do instead is remove all the blocking calls and use spark.streams.awaitAnyTermination instead (waiting for either query1 or query2 to terminate). Make sure you do that after the query2.start call. I hope this helps. Cheers, Panagiotis…

Re: [Structured Streaming] More than 1 streaming in a code

2018-04-06 Thread Aakash Basu
…spark.streams.awaitAnyTermination instead (waiting for either query1 or query2 to terminate). Make sure you do that after the query2.start call. I hope this helps. Cheers, Panagiotis. On Fri, Apr 6, 2018 at 11:23 AM, Aakash Basu wrote: Any help?…

Re: [Structured Streaming] More than 1 streaming in a code

2018-04-06 Thread Panagiotis Garefalakis
…(forwarded message from Aakash Basu, Thu, Apr 5, 2018 at 3:18 PM, Subject: [Structured Streaming] More than 1 streaming in a code, To: user) Hi, if I have more than one writeStream in a code, which operates on the same readStream data, why does it produce only the first writeStream?…

Fwd: [Structured Streaming] More than 1 streaming in a code

2018-04-06 Thread Aakash Basu
Any help? Need urgent help. Someone please clarify the doubt. -- Forwarded message from Aakash Basu, Thu, Apr 5, 2018 at 3:18 PM, Subject: [Structured Streaming] More than 1 streaming in a code, To: user -- Hi, if I have more than one writeStream in a code, which operates on the same readStream data, why does it produce only the first writeStream?…

[Structured Streaming] More than 1 streaming in a code

2018-04-05 Thread Aakash Basu
Hi, if I have more than one writeStream in a code, each operating on the same readStream data, why does it produce output only for the first writeStream? I want the second one to be printed on the console as well. How can I do that? from pyspark.sql import SparkSession; from pyspark.sql.functions import split, co…
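The preview cuts off before the code, so the following is only a hedged reconstruction of the likely shape of the problem, based on the replies in this thread rather than the original snippet: a blocking awaitTermination() placed right after the first query's start() keeps the driver from ever reaching the second writeStream, which is why only the first sink prints.

from pyspark.sql import SparkSession
from pyspark.sql.functions import explode, split

spark = SparkSession.builder.appName("TwoWriteStreamsProblem").getOrCreate()

# One streaming source shared by both queries (host/port are assumptions).
lines = spark.readStream \
    .format("socket") \
    .option("host", "localhost") \
    .option("port", 9999) \
    .load()

words = lines.select(explode(split(lines.value, " ")).alias("word"))
wordCounts = words.groupBy("word").count()

# First writeStream starts, then the driver thread blocks here...
query1 = words.writeStream \
    .outputMode("append") \
    .format("console") \
    .start()
query1.awaitTermination()   # blocks indefinitely

# ...so this second writeStream is never reached, and only the first sink
# ever prints. Starting both queries first and then calling
# spark.streams.awaitAnyTermination() once (as suggested in the thread)
# avoids the problem.
query2 = wordCounts.writeStream \
    .outputMode("complete") \
    .format("console") \
    .start()
query2.awaitTermination()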