Hi Jay,
The recommended way to build Spark from source is with Maven.
You would want to follow the steps in
https://spark.apache.org/docs/latest/building-with-maven.html, in particular
setting MAVEN_OPTS to prevent out-of-memory errors during the build.
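For quick reference, the settings from that page (Spark 1.x era) look roughly like the sketch below; the exact heap sizes may differ for your setup, and the build command is shown but not run here:

```shell
# MAVEN_OPTS values recommended by the "Building Spark with Maven" page
# to avoid OOM / PermGen errors while compiling (sizes are illustrative):
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"

# Then build, skipping tests (not executed here):
#   mvn -DskipTests clean package
```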
Thanks
On Tue, Aug 26, 2014 at 5:49 PM, jay vyas wrote:
> H
On Mon, Aug 4, 2014 at 1:01 PM, Anand Avati wrote:
>
>
>
> On Sun, Aug 3, 2014 at 9:09 PM, Patrick Wendell wrote:
>
>> Hey Anand,
>>
>> Thanks for looking into this - it's great to see momentum towards Scala
>> 2.11 and I'd love if this la
On Sun, Aug 3, 2014 at 9:09 PM, Patrick Wendell wrote:
> Hey Anand,
>
> Thanks for looking into this - it's great to see momentum towards Scala
> 2.11 and I'd love it if this landed in Spark 1.2.
>
> For the external dependencies, it would be good to create a sub-task of
> SPARK-1812 to track our effo
We are currently blocked on the unavailability of the following external
dependencies for porting Spark to Scala 2.11 [SPARK-1812 Jira]:
- akka-*_2.11 (2.3.4-shaded-protobuf from org.spark-project). The shaded
protobuf needs to be 2.5.0, and the shading is needed because Hadoop1
specifically needs p
I am bumping into this problem as well. I am trying to move to akka 2.3.x
from 2.2.x in order to port to Scala 2.11, since only akka 2.3.x is available
for Scala 2.11. All akka 2.2.x versions work fine, while all akka 2.3.x
versions give the following exception in "new SparkContext". Still
investigating why...
java.util.
On Tue, May 13, 2014 at 8:26 AM, Michael Malak wrote:
> Reposting here on dev since I didn't see a response on user:
>
> I'm seeing different Serializable behavior in Spark Shell vs. Scala Shell.
> In the Spark Shell, equals() fails when I use the canonical equals()
> pattern of match{}, but works
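For readers following along, the "canonical equals() pattern of match{}" presumably refers to something like the sketch below (Point is a made-up class, not from the thread):

```scala
// A minimal sketch of the canonical equals()/hashCode() pattern using a
// match{} on Any, as discussed in the thread. Point is a hypothetical class.
class Point(val x: Int, val y: Int) extends Serializable {
  override def equals(other: Any): Boolean = other match {
    case p: Point => x == p.x && y == p.y
    case _        => false
  }
  override def hashCode: Int = 31 * x + y
}
```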
11?
>
Currently fighting to get all the dependencies in 2.11. Quick pointer to
where I can get sources for akka-*-X.Y-shaded-protobuf? Also, what's the
smallest set of dependencies to build the smallest testable subset of the
project?
Thanks!
> Matei
>
> On May 12, 2014, at 2:0
Hi,
Can someone share the reason why the Kryo serializer is not the default? Is
there anything to be careful about, i.e. a reason it is not enabled by
default?
Thanks!
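For context, Kryo can be enabled per-application via the configuration below (or the equivalent SparkConf calls); the registrator class name is a made-up example, not something from this thread:

```
# Switch the serializer to Kryo (this key and class name are real):
spark.serializer       org.apache.spark.serializer.KryoSerializer

# Optionally register application classes for more compact serialization.
# com.example.MyKryoRegistrator is a hypothetical KryoRegistrator subclass.
spark.kryo.registrator com.example.MyKryoRegistrator
```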
ersion).
>
> Matei
>
> On May 8, 2014, at 6:33 PM, Anand Avati wrote:
>
> > Is there an ongoing effort (or intent) to support Spark on Scala 2.11?
> > Approximate timeline?
> >
> > Thank
>
Is there an ongoing effort (or intent) to support Spark on Scala 2.11?
Approximate timeline?
Thanks