I believe that's a question for the NiFi list; as you can see, the code
base is quite old
https://github.com/apache/nifi/tree/master/nifi-external/nifi-spark-receiver/src/main/java/org/apache/nifi/spark
and it doesn't make use of the
https://github.com/apache/spark/blob/master/streaming/src/main/
Some more info:
val lines = ssc.socketStream() // works
val lines = ssc.receiverStream(new NiFiReceiver(conf, StorageLevel.MEMORY_AND_DISK_SER_2)) // does not work
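For reference, the settings being tested look roughly like this (a minimal sketch: the two config keys are the standard Spark Streaming ones, the app name and the values are illustrative placeholders):

```scala
import org.apache.spark.SparkConf

// Sketch of the rate-limit settings under discussion (values illustrative):
val conf = new SparkConf()
  .setAppName("nifi-rate-limit-test")                  // placeholder name
  .set("spark.streaming.receiver.maxRate", "100")      // records/sec, per receiver
  .set("spark.streaming.backpressure.enabled", "true") // dynamic rate control
```

Note that these limits are enforced inside the receiver's record-storing path, so whether a given receiver implementation honours them depends on how it calls store().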
Margus (margusja) Roo
http://margus.roo.ee
skype: margusja
https://www.facebook.com/allan.tuuring
+372 51 48 780
On 15/09/2017 21:5
Hi
I tested spark.streaming.receiver.maxRate and
spark.streaming.backpressure.enabled settings using socketStream and
they work.
But if I am using nifi-spark-receiver
(https://mvnrepository.com/artifact/org.apache.nifi/nifi-spark-receiver)
then it does not honour
spark.streaming.rec
This might be related: SPARK-6985
Cheers
On Wed, Jul 1, 2015 at 10:27 AM, Laeeq Ahmed
wrote:
> Hi,
>
> I have set "spark.streaming.receiver.maxRate" to "100". My batch interval
> is 4sec but still sometimes there are more than 400 records per batch. I am
> using spark 1.2.0.
>
> Regards,
> Lae
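
For what it's worth, the numbers can be sanity-checked (assuming maxRate is enforced per second, per receiver, which is how the setting is documented):

```scala
// Sanity check: with maxRate = 100 records/sec and a 4 s batch interval,
// a single receiver's batch should be capped at roughly maxRate * interval.
val maxRatePerSec = 100
val batchIntervalSec = 4
val expectedBatchCap = maxRatePerSec * batchIntervalSec
println(expectedBatchCap) // 400
// Batches consistently above 400 records suggest the limit is not applied.
```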