Hey Anand,

Thanks for looking into this - it's great to see momentum towards Scala
2.11, and I'd love it if this lands in Spark 1.2.

For the external dependencies, it would be good to create a sub-task of
SPARK-1812 to track our efforts encouraging other projects to upgrade. In
certain cases (e.g. Kafka) there is fairly late-stage work on this already,
so we can link to those JIRAs as well. A good starting point is to just go
to their dev list and ask what the status is; most Scala projects have put
at least some thought into this already. Another thing we can do is submit
patches ourselves to those projects to help get them upgraded. The Twitter
libraries, for example, tend to be pretty small and also open to external
contributions.

One other thing in the mix here - Prashant Sharma has also spent some time
looking at this, so it might be good for you two to connect (probably off
list) and sync up. Prashant has contributed to many Scala projects, so he
might have cycles to go and help some of our dependencies get upgraded -
but I won't commit to that on his behalf :).

Regarding Akka - I shaded and published akka as a one-off thing:
https://github.com/pwendell/akka/tree/2.2.3-shaded-proto

Over time we've had to publish our own versions of a small number of
dependencies. It's somewhat high overhead, but it actually works quite well
in terms of avoiding some of the nastier dependency conflicts - at least
better than other alternatives I've seen, such as using a shade build
plugin.

Going forward, I'd actually like to track these in the Spark repo itself.
For instance, we could have a bash script in the Spark repo that checks out
Akka, applies a few patches or regular-expression rewrites, and leaves you
with a fully shaded dependency that can be published to Maven. If you
wanted to take a crack at something like that for Akka 2.3.4, be my guest.
I can help with the actual publishing.
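
To make that concrete, here's a rough sketch of what such a script could
look like (written in Python rather than bash just for readability). The
repository URL, tag, patch file name, and build command are illustrative
assumptions, not anything that exists in the Spark repo today:

#!/usr/bin/env python
# Rough sketch of the shading script described above. The repo URL, tag,
# patch file, and build command are assumptions for illustration only.
import subprocess

AKKA_REPO = "https://github.com/akka/akka.git"  # assumed upstream repo
AKKA_TAG = "v2.3.4"                             # release we want to shade
PATCHES = ["shade-protobuf.patch"]              # hypothetical patch file(s)

def run(cmd, cwd=None):
    """Run a command, echo it, and fail loudly if it exits non-zero."""
    print("+ " + " ".join(cmd))
    subprocess.check_call(cmd, cwd=cwd)

def main():
    # 1. Check out the upstream Akka release at the tag we want to shade.
    run(["git", "clone", "--branch", AKKA_TAG, "--depth", "1",
         AKKA_REPO, "akka"])

    # 2. Apply the shading patches, e.g. rewriting the protobuf package
    #    names and switching the organization to org.spark-project.
    for patch in PATCHES:
        run(["git", "apply", "../" + patch], cwd="akka")

    # 3. Build and publish the shaded artifacts to the local ivy/maven
    #    cache; pushing to a public repository stays a manual step.
    run(["sbt", "+publishLocal"], cwd="akka")

if __name__ == "__main__":
    main()

The published artifacts would presumably follow the same naming convention
as the existing shaded 2.2.3 release linked above.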

- Patrick


On Sat, Aug 2, 2014 at 6:04 PM, Anand Avati <av...@gluster.org> wrote:

> We are currently blocked on the non-availability of the following external
> dependencies for porting Spark to Scala 2.11 [SPARK-1812 Jira]:
>
> - akka-*_2.11 (2.3.4-shaded-protobuf from org.spark-project). The shaded
> protobuf needs to be 2.5.0, and the shading is needed because Hadoop1
> specifically needs protobuf 2.4. The issues arising from this
> incompatibility are already explained in the SPARK-1812 Jira.
>
> - chill_2.11 (0.4 from com.twitter) for core
> - algebird_2.11 (0.7 from com.twitter) for examples
> - kafka_2.11 (0.8 from org.apache) for external/kafka and examples
> - akka-zeromq_2.11 (2.3.4 from com.typesafe, but probably not needed if a
> shaded-protobuf version is released from org.spark-project)
>
> First,
> Who do I pester to get org.spark-project artifacts published for the akka
> shaded-protobuf version?
>
> Second,
> In the past, what has been the convention for requesting/pestering
> external projects to re-release artifacts for a new Scala version?
>
> Thanks!
>