Thanks for the reply, Jean.

In my project, I'm working on a higher abstraction layer over Spark Streaming to build a data processing product, and I'm trying to provide a common API for Java and Scala developers. You can see the abstract class defined here: https://github.com/InterestingLab/waterdrop/blob/master/waterdrop-apis/src/main/scala/io/github/interestinglab/waterdrop/apis/BaseStreamingInput.scala
There is a method, getDStream, that returns a DStream[T]. Currently a Scala class can extend this class and override getDStream, but I also want a Java class to be able to extend it and return a JavaDStream. This is my real problem.

Tell me if the above description is not clear, because English is not my native language.

Thanks in advance,
Gary

On Tue, May 14, 2019 at 11:06 PM Jean Georges Perrin <j...@jgp.net> wrote:

> There is a little more to it than the list you specified; nevertheless,
> some data types are not directly compatible between Scala and Java and
> require conversion, so it's good not to pollute your code with plenty of
> conversions and to focus on using the straight API.
>
> I don't remember off the top of my head, but if you use more Spark 2
> features (DataFrames, Structured Streaming...) you will need fewer of
> those Java-specific APIs.
>
> Do you see a problem here? What's your take on this?
>
> jg
>
>
> On May 14, 2019, at 10:22, Gary Gao <garygaow...@gmail.com> wrote:
>
> Hi all,
>
> I am wondering why we need Java-friendly APIs in Spark. Why can't we
> just use the Scala APIs in Java code? What's the difference?
>
> Some examples of Java-friendly APIs mentioned in Spark code are as follows:
>
> JavaDStream
> JavaInputDStream
> JavaStreamingContext
> JavaSparkContext
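For what it's worth, one common way to handle this follows Spark's own design: JavaDStream is a thin wrapper around the Scala DStream and exposes the wrapped stream (via its `dstream` member). You can keep getDStream as the single contract and add a Java-facing abstract subclass whose implementers return the wrapper; the base method is then derived by unwrapping. The sketch below models the pattern with stand-in types (`MiniDStream`, `MiniJavaDStream` — invented names, not Spark classes) so it compiles without Spark on the classpath; it is a pattern sketch under those assumptions, not a drop-in implementation.

```java
// Stand-in for Spark's DStream[T].
class MiniDStream<T> {
    final String label;
    MiniDStream(String label) { this.label = label; }
}

// Stand-in for Spark's JavaDStream[T]: a thin wrapper that
// exposes the underlying Scala-side stream.
class MiniJavaDStream<T> {
    final MiniDStream<T> dstream;
    MiniJavaDStream(MiniDStream<T> dstream) { this.dstream = dstream; }
}

// The existing Scala-facing contract: implementers return the
// Scala-side stream type.
abstract class BaseStreamingInput<T> {
    public abstract MiniDStream<T> getDStream();
}

// Java-facing layer: Java implementers override getJavaDStream();
// getDStream() is derived by unwrapping, so the rest of the framework
// only ever sees the Scala-side type.
abstract class JavaBaseStreamingInput<T> extends BaseStreamingInput<T> {
    public abstract MiniJavaDStream<T> getJavaDStream();

    @Override
    public final MiniDStream<T> getDStream() {
        return getJavaDStream().dstream;
    }
}

public class Demo {
    public static void main(String[] args) {
        // A Java developer implements only the JavaDStream-returning method.
        BaseStreamingInput<String> javaInput = new JavaBaseStreamingInput<String>() {
            @Override
            public MiniJavaDStream<String> getJavaDStream() {
                return new MiniJavaDStream<>(new MiniDStream<>("from-java"));
            }
        };
        // The framework calls getDStream() uniformly, regardless of
        // which language the implementer used.
        System.out.println(javaInput.getDStream().label);
    }
}
```

With the real Spark types, the unwrapping step would use JavaDStream's `dstream` accessor in the same way, and Scala implementers would keep extending the base class directly.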