Hi Siva
In that case you can use Structured Streaming's foreach / foreachBatch functions,
which let you process each record (foreach) or each micro-batch (foreachBatch) and write it to some sink
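A minimal sketch of the foreachBatch approach, assuming the elasticsearch-spark connector is on the classpath and a streaming source is available; the rate source, the `processed_at` column, and the index name `my-index` are placeholders for illustration:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions.current_timestamp

val spark = SparkSession.builder
  .appName("es-foreachBatch-example")
  .getOrCreate()

// Placeholder streaming source; replace with Kafka, files, etc.
val df: DataFrame = spark.readStream
  .format("rate")
  .load()

val query = df.writeStream
  .foreachBatch { (batch: DataFrame, batchId: Long) =>
    // Modify each row before indexing, e.g. add a timestamp column.
    val transformed = batch.withColumn("processed_at", current_timestamp())

    // saveToEs comes from the elasticsearch-spark connector
    // (org.elasticsearch.spark.sql._); the index name is illustrative.
    import org.elasticsearch.spark.sql._
    transformed.saveToEs("my-index")
  }
  .start()

query.awaitTermination()
```

foreachBatch gives you the batch as a plain DataFrame, so any batch-mode transformation or writer (including connectors without native streaming support) can be applied inside the closure.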
--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
Hi Jainshasha,
I need to read each row from the DataFrame and make some changes to it before
inserting it into ES.
Thanks
Siva
On Mon, Oct 5, 2020 at 8:06 PM jainshasha wrote:
Hi Siva
To emit data into ES from a Spark Structured Streaming job you need to use an
Elasticsearch jar that has support for a Structured Streaming sink. For this
you can use my branch, where we have integrated ES with Spark 3.0 and
Scala 2.12:
https://github.com/Thales
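For reference, with a connector that registers a streaming sink (as the upstream elasticsearch-hadoop connector does), writing the stream directly to ES can be sketched like this; the host, port, checkpoint path, and index name are all placeholder values, and `df` is assumed to be a streaming DataFrame:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

val spark = SparkSession.builder
  .appName("es-sink-example")
  .getOrCreate()

// Placeholder streaming source; replace with your real input.
val df: DataFrame = spark.readStream
  .format("rate")
  .load()

// "es" is the format name registered by the elasticsearch-spark
// connector; a checkpoint location is required for streaming writes.
val query = df.writeStream
  .format("es")
  .option("checkpointLocation", "/tmp/es-checkpoint")
  .option("es.nodes", "localhost")
  .option("es.port", "9200")
  .start("my-index") // target index (illustrative name)

query.awaitTermination()
```

This path is simpler than foreachBatch when no per-row changes are needed; if rows must be transformed first, apply the transformations to `df` before `writeStream`, or use the foreachBatch approach suggested earlier in the thread.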