+1
I verified that the published REPL jars work fine with the Spark Kernel
project (I can build and test against them).
Signed,
Chip Senkbeil
From: Krishna Sankar
To: Sean Owen
Cc: Patrick Wendell, "dev@spark.apache.org"
Date: 01/28/2015 02:52 PM
Subject: Re: [VOT…
… to connect to the
Spark Kernel without needing to implement the ZeroMQ protocol.
Signed,
Chip Senkbeil
From: Sam Bessalah
To: Robert C Senkbeil/Austin/IBM@IBMUS
Date: 12/12/2014 04:20 PM
Subject: Re: IBM open-sources Spark Kernel
Wow. Thanks. Can't wait to try this out.
We are happy to announce a developer preview of the Spark Kernel which
enables remote applications to dynamically interact with Spark. You can
think of the Spark Kernel as a remote Spark Shell that uses the IPython
notebook interface to provide a common entrypoint for any application. The
Spark …
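Since the announcement above describes the Spark Kernel as speaking the IPython notebook interface, here is a minimal sketch of the kind of execute_request message a client would send. The field names follow the public IPython messaging protocol (the wire format carried over ZeroMQ); no Spark Kernel API is used or assumed, and the username value is purely illustrative.

```python
import json
import uuid

def execute_request(code):
    """Build a minimal execute_request body per the IPython messaging
    protocol. Sketch only; a real client also signs and frames the message
    before sending it over the ZeroMQ sockets."""
    return {
        "header": {
            "msg_id": str(uuid.uuid4()),
            "session": str(uuid.uuid4()),
            "username": "example",          # illustrative value
            "msg_type": "execute_request",
        },
        "parent_header": {},
        "metadata": {},
        "content": {"code": code, "silent": False},
    }

msg = execute_request("sc.parallelize(1 to 10).sum()")
print(json.dumps(msg["content"]))
```

Wrapping the protocol this way is what lets a client library hide ZeroMQ entirely, which is the point made elsewhere in this thread.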
Hi there,
I wanted to ask whether anyone has successfully used Jython with the
pyspark library. I wasn't sure if the C extension support was needed for
pyspark itself or was just a bonus of using Cython.
There was a claim (
http://apache-spark-developers-list.1001551.n3.nabble.com/PySpar
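One small check relevant to the Jython question above: the standard-library platform module reports which Python implementation a script is running under, which matters because Jython cannot load CPython C extensions (so anything Cython-compiled is unavailable there). This is a generic sketch, nothing Spark-specific is assumed.

```python
import platform

# Report which Python implementation is running. Known values include
# "CPython", "Jython", "PyPy", and "IronPython".
impl = platform.python_implementation()
print(impl)

if impl != "CPython":
    # C extensions (and Cython-built modules) may not be importable here.
    print("C extensions may be unavailable on", impl)
```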
I've created a new pull request, which can be found at
https://github.com/apache/spark/pull/1929. Since Spark uses Scala
2.10.3 and there is a known issue with Scala 2.10.x not supporting the :cp
command (https://issues.scala-lang.org/browse/SI-6502), the Spark shell
does not have the ability …