I am running the same version of Spark on the server (master + worker) and
on the client / driver.
For the server I am using the binaries spark-1.1.0-bin-hadoop1,
and on the client I am using the same version:
> <dependency>
>   <groupId>org.apache.spark</groupId>
>   <artifactId>spark-core_2.10</artifactId>
>   <version>1.1.0</version>
> </dependency>
It seems more likely that you are running different versions of Spark.
Thanks
Best Regards
On Wed, Nov 5, 2014 at 3:05 AM, Saiph Kappa wrote:
> I set the host and port of the driver and now the error slightly changed
>
> Using Spark's default log4j profile:
> org/apache/spark/log4j-defaults.properties
I set the host and port of the driver and now the error slightly changed:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
14/11/04 21:13:48 INFO CoarseGrainedExecutorBackend: Registered signal
handlers for [TERM, HUP, INT]
14/11/04 21:13:48 INFO SecurityManag
If you want to run the Spark application from a remote machine, then you
have to set at least the following configurations properly.
*spark.driver.host* - points to the IP/host from which you are submitting
the job (make sure the cluster can reach this address)
*spark.driver.port* - set i
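For reference, both properties can be set programmatically on the driver before the SparkContext is created. A minimal sketch, assuming Spark 1.1's SparkConf API; the master URL, host IP, and port number below are placeholder values, not taken from the thread:

```scala
// Hedged sketch: configure a remotely-running driver so executors can
// connect back to it. All addresses/ports here are hypothetical examples.
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("spark://master-host:7077")      // cluster master URL (placeholder)
  .setAppName("RemoteDriverExample")
  .set("spark.driver.host", "192.168.1.10")   // IP of the machine submitting the job
  .set("spark.driver.port", "51000")          // fixed port reachable from the cluster

val sc = new SparkContext(conf)
```

The same values can instead be passed on the command line via `--conf spark.driver.host=... --conf spark.driver.port=...` when submitting, which avoids hard-coding them in the application.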