Hi, all  

I just created a JIRA issue: https://spark-project.atlassian.net/browse/SPARK-1139.
The issue discusses the following:

The new-Hadoop-API-based Spark methods are actually a mixture of the old and new
Hadoop APIs.

Spark's APIs still take a JobConf (or Configuration) as one of the parameters,
but in the new Hadoop API, Configuration has been replaced by mapreduce.Job.

For example:

http://codesfusion.blogspot.ca/2013/10/hadoop-wordcount-with-new-map-reduce-api.html

&

http://www.slideshare.net/sh1mmer/upgrading-to-the-new-map-reduce-api (p. 10)
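
To illustrate the mismatch, here is a minimal sketch (assuming roughly the current newAPIHadoopFile signature and a Hadoop 2.x Job.getInstance; the paths are placeholders):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.mapreduce.Job
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.spark.SparkContext

object NewApiMixExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local", "example")

    // In the new Hadoop API, job setup goes through mapreduce.Job,
    // which wraps a Configuration internally:
    val job = Job.getInstance(new Configuration())

    // But Spark's new-API method still takes the raw Configuration,
    // so the two styles end up mixed, and the Configuration has to be
    // pulled back out of the Job:
    val rdd = sc.newAPIHadoopFile[LongWritable, Text, TextInputFormat](
      "hdfs://path/to/input",            // placeholder path
      classOf[TextInputFormat],
      classOf[LongWritable],
      classOf[Text],
      job.getConfiguration
    )
  }
}
```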

Personally, I think it's better to fix this design, but it will introduce some
compatibility issues.

I'm just bringing it up here for your advice.

Best,  

--  
Nan Zhu
