I suspect Patrick is right about the cause. The Maven artifact that
was released does contain this class (phew):

http://search.maven.org/#artifactdetails%7Corg.apache.spark%7Cspark-core_2.10%7C1.0.0%7Cjar

As to the hadoop1 / hadoop2 artifact question -- agreed, that is often
done. Here the working theory seems to be to depend on the one
artifact (whose API should be identical regardless of dependencies)
and then customize the hadoop-client dependency yourself. There are
not two Hadoop-specific versions deployed to Maven at all.


On Sun, Jun 8, 2014 at 4:02 PM, Patrick Wendell <pwend...@gmail.com> wrote:
> Paul,
>
> Could you give the version of Java that you are building with and the
> version of Java you are running with? Are they the same?
>
> Just off the cuff, I wonder if this is related to:
> https://issues.apache.org/jira/browse/SPARK-1520
>
> If it is, it could appear that certain functions are not in the jar
> because they go beyond the extended zip boundary and `jar tvf` won't
> list them.
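
For what it's worth, here is a quick way to double-check the assembly
jar directly, independent of `jar tvf` output. Just a sketch, assuming
Java 7+ (whose java.util.zip.ZipFile reads zip64 archives); the jar
path and class name below are placeholders, not anything specific to
this thread:

  import java.util.zip.ZipFile

  object CheckJarEntry {
    def main(args: Array[String]): Unit = {
      // Substitute your assembly jar and the class that appears missing.
      val jar   = new ZipFile("spark-assembly-1.0.0.jar")
      val entry = "org/apache/spark/SparkContext.class"
      println(s"total entries: ${jar.size()}")
      println(s"$entry present: ${jar.getEntry(entry) != null}")
      jar.close()
    }
  }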
