The contents of spark-env.sh are:
SPARK_MASTER_IP=marvin.spark.ins-01
SPARK_MASTER_PORT=7077
SPARK_MASTER_WEBUI_PORT=8080
SPARK_WORKER_WEBUI_PORT=8081
SPARK_WORKER_INSTANCES=1
SPARK_LOCAL_IP=marvin.spark.ins-01
The contents of /etc/hosts are:
172.28.161.33 marvin.base.ins-01
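Note that spark-env.sh points SPARK_MASTER_IP and SPARK_LOCAL_IP at marvin.spark.ins-01, while the /etc/hosts line shown only maps marvin.base.ins-01. If marvin.spark.ins-01 is meant to resolve to the same machine, the hosts file would presumably need an entry along these lines (the IP below is only an assumption, reusing the address already shown):

172.28.161.33 marvin.base.ins-01 marvin.spark.ins-01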
Got it, thank you.
--
No, the ./sbin/start-master.sh --ip option did not work. It still fails with the
same error.
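For reference, a minimal sketch of how the bind address is usually passed to the standalone master; --host is the current spelling of the deprecated --ip flag, and the hostname and ports here are just the ones from the spark-env.sh above:

./sbin/start-master.sh --host marvin.spark.ins-01 --port 7077 --webui-port 8080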
--
Hi all,
I am trying to work with the spark-redis connector (redislabs), which
requires all transactions between Redis and Spark to go through RDDs. The language
I am using is Java, but the connector does not accept JavaRDDs, so I tried
using SparkContext in my code instead of JavaSparkContext. But when
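For what it's worth, a minimal sketch of one common workaround, assuming the connector only exposes a Scala RDD-based API: keep building the data with JavaSparkContext and JavaRDD as usual, then unwrap the underlying Scala RDD with JavaRDD.rdd() when handing it to the connector. The RedisContext/toRedisKV call in the comment below is only an assumption about the spark-redis API, not taken from this thread.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;
import java.util.Arrays;

public class RedisWriteSketch {
    public static void main(String[] args) {
        // Local master only so the sketch runs standalone; drop it when
        // submitting to the cluster with spark-submit.
        SparkConf conf = new SparkConf()
                .setAppName("redis-write-sketch")
                .setMaster("local[*]");
        JavaSparkContext jsc = new JavaSparkContext(conf);

        // Build the key/value pairs as a JavaRDD, as usual from Java.
        JavaRDD<Tuple2<String, String>> kv = jsc.parallelize(Arrays.asList(
                new Tuple2<>("key1", "value1"),
                new Tuple2<>("key2", "value2")));

        // JavaRDD is just a thin wrapper; rdd() exposes the underlying
        // Scala RDD that an RDD-only connector API can accept.
        org.apache.spark.rdd.RDD<Tuple2<String, String>> scalaRdd = kv.rdd();

        // Hypothetical connector call (class and method names are assumptions):
        // new RedisContext(jsc.sc()).toRedisKV(scalaRdd, redisConfig);

        jsc.stop();
    }
}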