Hello,

  I am trying to use Spark in the following scenario:

  I have code written for Hadoop and am now trying to migrate it to
Spark. The mappers and reducers are fairly complex, so I am wondering:
can I reuse the map() functions I already wrote in Hadoop (Java) and
use Spark to chain them, mixing those Java map() functions with Spark
operators?
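
  To make the question concrete, here is roughly what I have in mind.
This is only a sketch: legacyMap() is a made-up stand-in for the body
of one of my real Hadoop map() methods, refactored into a plain static
function with no dependency on the Mapper Context, and the HDFS paths
are placeholders.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.api.java.function.PairFunction;
import scala.Tuple2;

public class MigrationSketch {
    // Stand-in for the logic of an existing Hadoop map() method.
    static Tuple2<String, Integer> legacyMap(String line) {
        return new Tuple2<>(line, 1);
    }

    public static void main(String[] args) {
        JavaSparkContext sc =
            new JavaSparkContext(new SparkConf().setAppName("migration"));
        JavaRDD<String> input = sc.textFile("hdfs:///path/to/input");

        // Wrap the legacy map logic in a Spark transformation, then
        // chain ordinary Spark operators after it.
        JavaPairRDD<String, Integer> counts = input
            .mapToPair(new PairFunction<String, String, Integer>() {
                @Override
                public Tuple2<String, Integer> call(String line) {
                    return legacyMap(line);
                }
            })
            .reduceByKey(new Function2<Integer, Integer, Integer>() {
                @Override
                public Integer call(Integer a, Integer b) {
                    return a + b;
                }
            });

        counts.saveAsTextFile("hdfs:///path/to/output");
        sc.stop();
    }
}

Is this the right pattern for reusing Mapper code, or is there a
better way?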

  A related question: can I use external binaries as operators, the
way Hadoop Streaming does?
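
  For example, assuming my binary reads records on stdin and writes
results to stdout (the path below is just a placeholder), would
something like this work?

// pipe() sends each element of the RDD to the external command on
// stdin, one element per line, and returns the command's stdout
// lines as a new RDD of Strings, similar in spirit to Hadoop
// Streaming.
JavaRDD<String> piped = input.pipe("/path/to/my_binary");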

  Thanks!
Wei

 
