Re: [Structured Streaming] Reuse computation result

2018-02-01 Thread Sandip Mehta
You can use the persist() or cache() operation on the DataFrame.

On Tue, Dec 26, 2017 at 4:02 PM Shu Li Zheng wrote:
> Hi all,
>
> I have a scenario like this:
>
> val df = dataframe.map().filter()
> // agg 1
> val query1 = df.sum.writeStream.start
> // agg 2
> val query2 = df.count.writeStream.start
>
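A rough sketch of how that caching idea tends to be applied in practice: persist()/cache() on an unbounded streaming DataFrame is generally rejected by the analyzer, so the sketch below uses foreachBatch (available from Spark 2.4, i.e. after this thread) to persist each micro-batch once and feed both aggregations from it. This is a swapped-in variant, not literally what the reply describes; the "rate" source, the "value" column, and the checkpoint path are placeholders.

import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions.{count, sum}

val spark = SparkSession.builder().appName("reuse-shared-work").getOrCreate()
import spark.implicits._

// Placeholder streaming source; any streaming DataFrame with a numeric "value" column works.
val df = spark.readStream.format("rate").load()
  .filter($"value" % 2 === 0)

// Persist each micro-batch once and reuse it for both aggregations,
// instead of running two independent queries that each re-read the source.
def processBatch(batchDF: DataFrame, batchId: Long): Unit = {
  batchDF.persist()
  batchDF.agg(sum("value")).show()    // "agg 1" from the original question
  batchDF.agg(count("value")).show()  // "agg 2" from the original question
  batchDF.unpersist()
}

val query = df.writeStream
  .option("checkpointLocation", "/tmp/ckpt/reuse-example")  // placeholder path
  .foreachBatch(processBatch _)
  .start()

query.awaitTermination()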

Re: [Structured Streaming] Reuse computation result

2017-12-29 Thread Lalwani, Jayesh
There is no way to solve this within Spark. One option is to break your application up into multiple applications. The first application can filter and write the filtered results into a Kafka queue. The second application can read from the queue and sum. The third application can read from the queue and do the count.
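For concreteness, a rough sketch of that split, assuming the Kafka integration package (spark-sql-kafka-0-10) is on the classpath; the broker address, topic name, checkpoint paths, and the "rate" source are placeholders, and the two halves would in practice be submitted as separate applications.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.sum

val spark = SparkSession.builder().appName("split-pipeline").getOrCreate()
import spark.implicits._

// --- Application 1: filter and publish the filtered rows to a Kafka topic ---
spark.readStream.format("rate").load()
  .filter($"value" % 2 === 0)
  .selectExpr("CAST(value AS STRING) AS value")   // Kafka sink expects a string/binary "value" column
  .writeStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "broker:9092")
  .option("topic", "filtered-events")
  .option("checkpointLocation", "/tmp/ckpt/filter-app")
  .start()

// --- Application 2: read the topic back and compute the running sum ---
// (the counting application would look the same with count() instead of sum())
spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "broker:9092")
  .option("subscribe", "filtered-events")
  .load()
  .selectExpr("CAST(value AS STRING) AS value")   // Kafka "value" arrives as binary
  .agg(sum($"value".cast("long")))
  .writeStream
  .outputMode("complete")
  .format("console")
  .option("checkpointLocation", "/tmp/ckpt/sum-app")
  .start()

spark.streams.awaitAnyTermination()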