Hello,

I am one of the committers for Apache NiFi (incubating). I am looking to 
integrate NiFi with Spark streaming. I have created a custom Receiver to 
receive data from NiFi. I’ve tested it locally, and things seem to work well.


I feel it would make more sense to have the NiFi Receiver in the Spark codebase 
alongside the code for Flume, Kafka, etc., as this is where people are more 
likely to look to see what integrations are available. Looking there, though, 
it seems that all of those are “fully integrated” into Spark, rather than being 
simple Receivers.


Is Spark interested in housing the code for Receivers to interact with other 
services, or should this just reside in the NiFi codebase?


Thanks for any pointers.

-Mark
