I did a quick Google search for "Java/Scala interoperability" and was surprised to find very few recent results on the topic. (Has the world given up?)
It's easy to use Java library code from Scala, but the opposite is not true. I would think about the problem this way: do *you* need to provide a Java API in your product? If you decide to support both, beware the Law of Leaky Abstractions <https://www.joelonsoftware.com/2002/11/11/the-law-of-leaky-abstractions/> and look at what the Spark team came up with. (DataFrames in version 2.0 target this same problem, among others: to provide a single abstraction that works across Scala, Java, Python, and R. But what they came up with still required the APIs you list to make it work.) Think carefully about what new things you're trying to provide and what things you're trying to hide beneath your abstraction.

HTH,
Jason

On Wed, May 15, 2019 at 8:24 AM Jean-Georges Perrin <j...@jgp.net> wrote:

> I see… Did you consider Structured Streaming?
>
> Otherwise, you could create a factory that builds your higher-level
> object and returns an interface defining your API, while the
> implementation varies based on the context.
>
> And English is not my native language either...
>
> Jean-Georges Perrin
> j...@jgp.net
>
> On May 14, 2019, at 21:47, Gary Gao <garygaow...@gmail.com> wrote:
>
> Thanks for the reply, Jean.
> In my project, I'm working on a higher abstraction layer over Spark
> Streaming to build a data processing product, and I'm trying to provide a
> common API for Java and Scala developers.
> You can see the abstract class defined here:
> https://github.com/InterestingLab/waterdrop/blob/master/waterdrop-apis/src/main/scala/io/github/interestinglab/waterdrop/apis/BaseStreamingInput.scala
>
> There is a method, getDStream, that returns a DStream[T]. Currently a
> Scala class can extend this class and override getDStream, but I also
> want a Java class to be able to extend it and return a JavaDStream.
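[Editor's note: one way to attack Gary's getDStream problem is the pattern Spark itself uses, where JavaDStream is a thin wrapper around a DStream that can be unwrapped again. The sketch below shows the shape with toy stand-ins — ScalaStream, JavaStream, BaseInput, BaseJavaInput and CountInput are hypothetical names, not Spark classes, since the real DStream/JavaDStream need a running StreamingContext.]

```scala
// Toy stand-in for DStream: a plain Seq plays the role of the stream,
// and transform takes a Scala function T => U.
class ScalaStream[T](val elems: Seq[T]) {
  def transform[U](f: T => U): ScalaStream[U] = new ScalaStream(elems.map(f))
}

// Toy stand-in for JavaDStream: same data, but Java-friendly types
// (java.util.function.Function and java.util.List instead of Scala
// functions and Seq), plus access to the wrapped Scala stream.
class JavaStream[T](val stream: ScalaStream[T]) {
  def transform[U](f: java.util.function.Function[T, U]): JavaStream[U] =
    new JavaStream(stream.transform((t: T) => f.apply(t)))

  def toList: java.util.List[T] = {
    val out = new java.util.ArrayList[T]()
    stream.elems.foreach(e => out.add(e))
    out
  }
}

// Scala implementers extend this and override getStream directly.
abstract class BaseInput[T] {
  def getStream: ScalaStream[T]
}

// Java implementers extend this instead: they override getJavaStream,
// and getStream is derived for free by unwrapping the wrapper, so the
// rest of the framework only ever sees ScalaStream.
abstract class BaseJavaInput[T] extends BaseInput[T] {
  def getJavaStream: JavaStream[T]
  final override def getStream: ScalaStream[T] = getJavaStream.stream
}

// Example "Java-side" implementer (written in Scala here for brevity):
class CountInput extends BaseJavaInput[Int] {
  def getJavaStream: JavaStream[Int] =
    new JavaStream(new ScalaStream(Seq(1, 2, 3)))
}
```

The design choice mirrors Spark's own JavaDStream.dstream accessor: the Java-facing abstract class satisfies the Scala-facing contract by unwrapping, so both kinds of subclasses plug into the same pipeline.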
> This is my real problem. Tell me if the above description is not clear,
> because English is not my native language.
>
> Thanks in advance,
> Gary
>
> On Tue, May 14, 2019 at 11:06 PM Jean Georges Perrin <j...@jgp.net> wrote:
>
>> There is a little bit more than the list you specified; nevertheless,
>> some data types are not directly compatible between Scala and Java and
>> require conversion, so it's good not to pollute your code with plenty of
>> conversions and to focus on using the straight API.
>>
>> I don't remember off the top of my head, but if you use more Spark 2
>> features (DataFrames, Structured Streaming...) you will require fewer of
>> those Java-specific APIs.
>>
>> Do you see a problem here? What's your take on this?
>>
>> jg
>>
>> On May 14, 2019, at 10:22, Gary Gao <garygaow...@gmail.com> wrote:
>>
>> Hi all,
>>
>> I am wondering why we need Java-friendly APIs in Spark. Why can't we
>> just use the Scala APIs in Java code? What's the difference?
>>
>> Some examples of Java-friendly APIs commented in the Spark code are as
>> follows:
>>
>> JavaDStream
>> JavaInputDStream
>> JavaStreamingContext
>> JavaSparkContext

-- 
Thanks,
Jason
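[Editor's note: to make Jean Georges's point about incompatible data types concrete, here is a minimal sketch in plain Scala, with no Spark required and all value names illustrative. A Scala Seq is not a java.util.List and vice versa, so without Java-friendly wrappers like JavaDStream, every Java caller would have to do conversions like these by hand.]

```scala
// A Scala collection, as a Scala API such as DStream would hand it out.
val scalaSeq: Seq[String] = Seq("a", "b", "c")

// Scala -> Java: copy element by element into a real java.util.List,
// which is what Java code actually wants to receive.
val javaList = new java.util.ArrayList[String]()
scalaSeq.foreach(e => javaList.add(e))

// Java -> Scala: walk the Java iterator back into a Scala collection.
val buf = scala.collection.mutable.ArrayBuffer.empty[String]
val it = javaList.iterator()
while (it.hasNext) buf += it.next()

println(javaList)              // prints [a, b, c]
println(buf.toSeq == scalaSeq) // prints true
```

This is exactly the boilerplate the JavaDStream/JavaSparkContext family hides: the Java-friendly classes do the wrapping once, inside Spark, instead of in every caller.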