On Tue, Jun 3, 2014 at 11:51 AM, Patrick Wendell wrote:
> Hey All,
>
> I wanted to announce the Spark 1.1 release window:
> June 1 - Merge window opens
> July 25 - Cut-off for new pull requests
> August 1 - Merge window closes (code freeze), QA period starts
> August 15+ - RC's and voting
>
>
In that case, does it work if you use Snappy instead of LZF?
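For anyone who wants to try that, the codec is controlled by the standard
spark.io.compression.codec setting; a minimal sketch:

    import org.apache.spark.{SparkConf, SparkContext}

    // Switch block compression from LZF to Snappy to sidestep the
    // big-endian LZF encoder bug discussed in this thread.
    val conf = new SparkConf()
      .setAppName("snappy-codec-test")
      .set("spark.io.compression.codec", "snappy")
    val sc = new SparkContext(conf)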
Regards,
Mridul
On Mon, Jun 16, 2014 at 7:34 AM, gchen wrote:
> To anyone who is interested in this issue, the root cause is from the
> third-party com.ning.compress.lzf.impl.UnsafeChunkEncoderBE class, since they
> have a broken
Hi Reynold, thanks for your interest in this issue. The work here is part of
incorporating Spark into the PowerLinux ecosystem.
Here is the bug raised in ning by my colleague:
https://github.com/ning/compress/issues/37
Would you mind sharing some insights on Spark's support for Big Endian
architectures?
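(For anyone reproducing this on POWER, a quick diagnostic sketch, just to
confirm the platform byte order that sends ning down the UnsafeChunkEncoderBE
path; nothing here is part of the fix itself.)

    import java.nio.ByteOrder

    // Prints BIG_ENDIAN on Power7 and LITTLE_ENDIAN on x86; the broken
    // big-endian encoder variant is only selected on BIG_ENDIAN platforms.
    println(ByteOrder.nativeOrder())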
I think you guys are / will be leading the effort on that :)
On Mon, Jun 16, 2014 at 4:15 PM, gchen wrote:
> Hi Reynold, thanks for your interest in this issue. The work here is part
> of incorporating Spark into the PowerLinux ecosystem.
>
> Here is the bug raised in ning by my colleague:
> https://github.com/ning/compress/issues/37
Of course, the community's kind support is essential :)
I didn't find ning's source code in the Spark git repository (or maybe I
missed it?), so the next time we hit a bug caused by third-party code, can we
do something (to fix the bug) from the Spark repository?
It is here:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/io/CompressionCodec.scala
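So the ning code is not vendored; it comes in as the com.ning:compress-lzf
dependency behind Spark's CompressionCodec abstraction. A fix from the Spark
side would normally just mean bumping that dependency once ning publishes a
patched release. A rough sbt sketch (the version number is a placeholder for
whichever release carries the big-endian fix):

    // project/SparkBuild.scala (sketch): pin a patched compress-lzf.
    // "1.0.1" is hypothetical; use whichever version fixes ning issue #37.
    libraryDependencies += "com.ning" % "compress-lzf" % "1.0.1"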
On Mon, Jun 16, 2014 at 4:26 PM, gchen wrote:
> I didn't find ning's source code in the Spark git repository (or maybe I
> missed it?), so the next time we hit a bug caused by thir
Hi, I encounter a JVM problem when integrating Spark with Mesos.
Here is the log when I run "spark-shell":
14/06/17 12:24:55 INFO HttpServer: Starting HTTP Server
14/06/17 12:24:55 INFO SparkUI: Started Spark Web UI at
http://bigdata001:4040
#
# A fatal error has been detected by the
I can't run sbt/sbt gen-idea on a clean checkout of Spark master.
I get resolution errors on junit#junit;4.10!junit.zip(source), as shown below:
aash@aash-mbp /tmp/git/spark$ sbt/sbt gen-idea
Using /Library/Java/JavaVirtualMachines/jdk1.7.0_45.jdk/Contents/Home as
default JAVA_HOME.
Note, this wi
Hi qingyang,
This looks like an issue with the open source version of the Java runtime
(called OpenJDK) that causes the JVM to fail. Can you try using the JVM
released by Oracle and see if it has the same issue?
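A quick way to double-check which JVM the shell actually picks up (just a
sanity check, runnable from spark-shell or any Scala REPL):

    // OpenJDK vs. Oracle shows up in the VM vendor/name properties.
    println(System.getProperty("java.vm.vendor"))
    println(System.getProperty("java.vm.name"))
    println(System.getProperty("java.version"))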
Thanks!
Andrew
On Mon, Jun 16, 2014 at 9:24 PM, qingyang li
wrote:
> Hi, I encou
I used the same command on Linux and it passed:
Linux k.net 2.6.32-220.23.1.el6.YAHOO.20120713.x86_64 #1 SMP Fri Jul 13
11:40:51 CDT 2012 x86_64 x86_64 x86_64 GNU/Linux
Cheers
On Mon, Jun 16, 2014 at 9:29 PM, Andrew Ash wrote:
> I can't run sbt/sbt gen-idea on a clean checkout of Spark master
Maybe it's a Mac OS X thing?
On Mon, Jun 16, 2014 at 9:57 PM, Ted Yu wrote:
> I used the same command on Linux and it passed:
>
> Linux k.net 2.6.32-220.23.1.el6.YAHOO.20120713.x86_64 #1 SMP Fri Jul 13
> 11:40:51 CDT 2012 x86_64 x86_64 x86_64 GNU/Linux
>
> Cheers
>
>
> On Mon, Jun 16, 2014 at 9