Hi

We would like to contribute to Spark but we are not sure how. We can offer
project management and release management to begin with. Please advise on how
to get engaged.

Thank you!!

Regards
Pritish
Nirvana International Inc.
Big Data, Hadoop, Oracle EBS and IT Solutions
VA - SWaM, MD - MBE Certified Company
prit...@nirvana-international.com
http://www.nirvana-international.com

> On August 2, 2014 at 9:04 PM Anand Avati <av...@gluster.org> wrote:
>
>
> We are currently blocked on the unavailability of the following external
> dependencies for porting Spark to Scala 2.11 [SPARK-1812 Jira] (a short sbt
> sketch of how these get pulled in follows the list):
>
> - akka-*_2.11 (2.3.4-shaded-protobuf from org.spark-project). The shaded
> protobuf needs to be 2.5.0, and the shading is needed because Hadoop1
> specifically needs protobuf 2.4. The issues arising from this
> incompatibility are already explained in the SPARK-1812 Jira.
>
> - chill_2.11 (0.4 from com.twitter) for core
> - algebird_2.11 (0.7 from com.twitter) for examples
> - kafka_2.11 (0.8 from org.apache) for external/kafka and examples
> - akka-zeromq_2.11 (2.3.4 from com.typesafe, but probably not needed if a
> shaded-protobuf version is released from org.spark-project)
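>
> For concreteness, here is a rough sbt sketch (artifact names and versions are
> only illustrative, not the actual Spark build) of why every one of these
> projects has to publish a 2.11 build first: sbt's %% operator appends the
> Scala binary version to the artifact name at resolution time, so none of
> these dependencies resolve until the _2.11 artifacts exist.
>
>   scalaVersion := "2.11.0"
>
>   libraryDependencies ++= Seq(
>     // %% expands each name to <artifact>_2.11 because scalaVersion is 2.11.x
>     "com.twitter"      %% "chill"         % "0.4.0",  // -> chill_2.11
>     "com.twitter"      %% "algebird-core" % "0.7.0",  // -> algebird-core_2.11
>     "org.apache.kafka" %% "kafka"         % "0.8.0"   // -> kafka_2.11
>   )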
>
> First,
> Who do I pester to get org.spark-project artifacts published for the akka
> shaded-protobuf version?
>
> Second,
> In the past, what has been the convention for requesting/pestering external
> projects to re-release artifacts for a new Scala version?
>
> Thanks!

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
