Re: [VOTE] Release Apache Spark 1.2.1 (RC2)

2015-01-29 Thread Robert C Senkbeil
+1 I verified that the REPL jars published work fine with the Spark Kernel project (can build/test against them). Signed, Chip Senkbeil. From: Krishna Sankar; To: Sean Owen; Cc: Patrick Wendell, "dev@spark.apache.org"; Date: 01/28/2015 02:52 PM; Subject: Re: [VOT…

Re: IBM open-sources Spark Kernel

2014-12-12 Thread Robert C Senkbeil
…ns to connect to the Spark Kernel without needing to implement the ZeroMQ protocol. Signed, Chip Senkbeil. From: Sam Bessalah; To: Robert C Senkbeil/Austin/IBM@IBMUS; Date: 12/12/2014 04:20 PM; Subject: Re: IBM open-sources Spark Kernel. Wow. Thanks. Can't wait to try this out.

IBM open-sources Spark Kernel

2014-12-12 Thread Robert C Senkbeil
We are happy to announce a developer preview of the Spark Kernel, which enables remote applications to dynamically interact with Spark. You can think of the Spark Kernel as a remote Spark Shell that uses the IPython notebook interface to provide a common entry point for any application. The Spark…

Jython importing pyspark?

2014-10-05 Thread Robert C Senkbeil
Hi there, I wanted to ask whether or not anyone has successfully used Jython with the pyspark library. I wasn't sure if the C extension support was needed for pyspark itself or was just a bonus of using Cython. There was a claim ( http://apache-spark-developers-list.1001551.n3.nabble.com/PySpar…
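As a quick way to investigate the question above, one can probe whether pyspark imports cleanly under a given interpreter. This is a hedged sketch: it only detects import failures, and whether a failure under Jython is actually caused by a C-extension dependency is an assumption, not something the script proves.

```python
import importlib

def can_import(name):
    """Return True if the named module imports cleanly under this interpreter."""
    try:
        importlib.import_module(name)
        return True
    except ImportError:
        return False

# Under CPython with pyspark on sys.path this is expected to succeed;
# under Jython an ImportError may indicate a dependency that needs
# CPython C extensions (an assumption to verify case by case).
if can_import("pyspark"):
    print("pyspark imports cleanly")
else:
    print("pyspark failed to import")
```

Running the same probe under both CPython and Jython would narrow down whether the problem lies in pyspark itself or in one of its transitive dependencies.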

Added support for :cp to the Spark Shell

2014-08-13 Thread Robert C Senkbeil
I've created a new pull request, which can be found at https://github.com/apache/spark/pull/1929. Since Spark is using Scala 2.10.3 and there is a known issue with Scala 2.10.x not supporting the :cp command (https://issues.scala-lang.org/browse/SI-6502), the Spark shell does not have the ability…
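For readers unfamiliar with :cp, its effect is to extend the REPL's search path at runtime so that subsequent loads can resolve newly added code. The idea can be illustrated with a Python analogue; this is a sketch of the general concept only, not the Scala REPL mechanism the pull request restores, and the directory name is hypothetical.

```python
import sys

def add_search_path(entry):
    """Append a directory to the module search path at runtime,
    loosely analogous to what :cp does for the REPL classpath."""
    if entry not in sys.path:
        sys.path.append(entry)
    return entry in sys.path

print(add_search_path("/tmp/extra-libs"))  # True once the entry is present
```

In Spark itself, SparkContext.addJar ships a jar to the executors, but it does not update the driver/REPL classpath, which is the gap the :cp support addresses.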