Yes, both are set. I just recompiled and still had no success; it's the
same silent failure.
Which versions of Java and Scala are you using?
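For reference, here's how I'm checking the versions and env vars on my end
(a minimal sketch; paths will vary by install):

    # Print the JVM and Scala versions on the PATH
    java -version
    scala -version

    # Confirm the env vars point at real installations
    echo $JAVA_HOME
    echo $SCALA_HOME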


On 17 August 2015 at 19:59, Charlie Hack <charles.t.h...@gmail.com> wrote:

> I had success earlier today on OSX Yosemite 10.10.4 building Spark 1.4.1
> using these instructions
> <http://genomegeek.blogspot.com/2014/11/how-to-install-apache-spark-on-mac-os-x.html>
> (using `$ sbt/sbt clean assembly`), with the additional step of downloading
> the proper sbt-launch.jar (0.13.7) from here
> <http://dl.bintray.com/typesafe/ivy-releases/org.scala-sbt/sbt-launch/0.13.7/>
> and replacing the one in build/, as you noted. Have you set the SCALA_HOME
> and JAVA_HOME environment variables?
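> For example, a rough sketch of what I have (the exact paths depend on how
> Java and Scala were installed, e.g. via Homebrew, so adjust to your setup):
>
>     # OSX's java_home utility resolves the active JDK
>     export JAVA_HOME=$(/usr/libexec/java_home)
>     # Typical Homebrew location; adjust if Scala lives elsewhere
>     export SCALA_HOME=/usr/local/opt/scala/libexec
>     export PATH=$SCALA_HOME/bin:$PATH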
>
> On Mon, Aug 17, 2015 at 8:36 PM, Alun Champion <a...@achampion.net> wrote:
>
>> Has anyone experienced issues running Spark 1.4.1 on Mac OSX Yosemite?
>>
>> I've been running a standalone 1.3.1 fine, but it failed when I tried to
>> run 1.4.1 (I also tried 1.4.0).
>>
>> I've tried both the pre-built packages and compiling from source, with the
>> same results in both cases. (I can successfully compile with both mvn and
>> sbt, after fixing the sbt-launch.jar, which was corrupt.)
>> After downloading/building Spark and running ./bin/pyspark or
>> ./bin/spark-shell, it silently exits with exit code 1.
>> Creating a context in Python, I get: Exception: Java gateway process
>> exited before sending the driver its port number.
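>> For what it's worth, this is how I'm observing the failure (run from the
>> Spark directory):
>>
>>     # launches, prints nothing, and exits immediately
>>     ./bin/spark-shell
>>     echo $?    # -> 1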
>>
>> I couldn't find any specific resolutions on the web.
>> I did add 'pyspark-shell' to PYSPARK_SUBMIT_ARGS, but to no effect.
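>> Roughly what I tried (the local[2] master here is just an example
>> setting):
>>
>>     # reportedly PYSPARK_SUBMIT_ARGS needs to end with 'pyspark-shell'
>>     export PYSPARK_SUBMIT_ARGS="--master local[2] pyspark-shell"
>>     ./bin/pyspark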
>>
>> Anyone have any further ideas I can explore?
>> Cheers
>>    -Alun.
>>
>>
>
>
> --
> # +17344761472
>
