Hi,

How did you install Spark 1.6? Uninstalling is usually as simple as
removing the installation directory with rm -rf, but it really depends on
how you installed it in the first place.
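
For example, if it is just an unpacked tarball (the usual case), something
along these lines should do it (the path below is a guess; use wherever you
actually extracted it):

    rm -rf ~/spark-1.6.2-bin-hadoop2.6   # your actual Spark 1.6 directory
    unset SPARK_HOME                     # if it still points at 1.6

If you installed it via Homebrew instead, brew uninstall apache-spark
should remove it.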

Best regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Sun, Sep 25, 2016 at 4:32 PM, vr spark <vrspark...@gmail.com> wrote:
> Yes, I have both Spark 1.6 and Spark 2.0.
> I unset the SPARK_HOME environment variable and pointed spark-submit to 2.0.
> It's working now.
>
> How do I uninstall/remove Spark 1.6 from my Mac?
>
> Thanks
>
>
> On Sun, Sep 25, 2016 at 4:28 AM, Jacek Laskowski <ja...@japila.pl> wrote:
>>
>> Hi,
>>
>> Can you execute run-example SparkPi with your Spark installation?
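>>
>> For example, from the Spark 2.0 directory:
>>
>>     ./bin/run-example SparkPi 10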
>>
>> Also, see the logs:
>>
>> 16/09/24 23:15:15 WARN Utils: Service 'SparkUI' could not bind on port
>> 4040. Attempting port 4041.
>>
>> 16/09/24 23:15:15 INFO Utils: Successfully started service 'SparkUI'
>> on port 4041.
>>
>> You've got two Spark runtimes up, which may or may not be contributing
>> to the issue.
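>>
>> To see what is already holding port 4040 (most likely another driver or
>> a spark-shell), something like this could help:
>>
>>     lsof -i :4040
>>
>> Stop that process and the SparkUI port conflict should go away.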
>>
>> Best regards,
>> Jacek Laskowski
>> ----
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>>
>> On Sun, Sep 25, 2016 at 8:36 AM, vr spark <vrspark...@gmail.com> wrote:
>> > Hi,
>> > I have this simple Scala app, which works fine when I run it as a Scala
>> > application from the Scala IDE for Eclipse.
>> > But when I export it as a jar and run it with spark-submit, I get the
>> > error below. Please suggest what to do.
>> >
>> > bin/spark-submit --class com.x.y.vr.spark.first.SimpleApp test.jar
>> >
>> > 16/09/24 23:15:15 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
>> >
>> > 16/09/24 23:15:15 INFO Utils: Successfully started service 'SparkUI' on port 4041.
>> >
>> > 16/09/24 23:15:15 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.1.3:4041
>> >
>> > 16/09/24 23:15:15 INFO SparkContext: Added JAR file:/Users/vr/Downloads/spark-2.0.0/test.jar at spark://192.168.1.3:59263/jars/test.jar with timestamp 1474784115210
>> >
>> > 16/09/24 23:15:15 INFO Executor: Starting executor ID driver on host localhost
>> >
>> > 16/09/24 23:15:15 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 59264.
>> >
>> > 16/09/24 23:15:15 INFO NettyBlockTransferService: Server created on 192.168.1.3:59264
>> >
>> > 16/09/24 23:15:16 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, partition 0, PROCESS_LOCAL, 5354 bytes)
>> >
>> > 16/09/24 23:15:16 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, localhost, partition 1, PROCESS_LOCAL, 5354 bytes)
>> >
>> > 16/09/24 23:15:16 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
>> >
>> > 16/09/24 23:15:16 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
>> >
>> > 16/09/24 23:15:16 INFO Executor: Fetching spark://192.168.1.3:59263/jars/test.jar with timestamp 1474784115210
>> >
>> > 16/09/24 23:16:31 INFO Executor: Fetching spark://192.168.1.3:59263/jars/test.jar with timestamp 1474784115210
>> >
>> > 16/09/24 23:16:31 ERROR Executor: Exception in task 1.0 in stage 0.0 (TID 1)
>> >
>> > java.io.IOException: Failed to connect to /192.168.1.3:59263
>> >     at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:228)
>> >     at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:179)
>> >     at org.apache.spark.rpc.netty.NettyRpcEnv.downloadClient(NettyRpcEnv.scala:358)
>> >     at org.apache.spark.rpc.netty.NettyRpcEnv.openChannel(NettyRpcEnv.scala:324)
>> >     at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:633)
>> >     at org.apache.spark.util.Utils$.fetchFile(Utils.scala:459)
>> >     at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:488)
>> >     at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:480)
>> >     at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
>> >     at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
>> >     at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
>> >     at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
>> >     at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
>> >     at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
>> >     at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
>> >     at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:480)
>> >     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:252)
>> >     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>> >     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>> >     at java.lang.Thread.run(Thread.java:745)
>> >
>> > My Scala code:
>> >
>> >
>> > package com.x.y.vr.spark.first
>> >
>> > /* SimpleApp.scala */
>> > import org.apache.spark.SparkContext
>> > import org.apache.spark.SparkContext._
>> > import org.apache.spark.SparkConf
>> >
>> > object SimpleApp {
>> >
>> >   def main(args: Array[String]) {
>> >     val logFile = "/Users/vttrich/Downloads/spark-2.0.0/README.md" // Should be some file on your system
>> >     val conf = new SparkConf().setAppName("Simple Application")
>> >     val sc = new SparkContext("local[*]", "RatingsCounter")
>> >     //val sc = new SparkContext(conf)
>> >     val logData = sc.textFile(logFile, 2).cache()
>> >     val numAs = logData.filter(line => line.contains("a")).count()
>> >     val numBs = logData.filter(line => line.contains("b")).count()
>> >     println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
>> >   }
>> > }
>
>
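
A side note on the code itself: new SparkContext("local[*]", "RatingsCounter")
hardcodes the master and the app name, so whatever master you pass to
spark-submit won't take effect. A minimal sketch of the more conventional
setup, building the context from the SparkConf you already create (and
letting spark-submit pick the master):

    import org.apache.spark.{SparkConf, SparkContext}

    object SimpleApp {
      def main(args: Array[String]): Unit = {
        // App name comes from the conf; master comes from spark-submit.
        val conf = new SparkConf().setAppName("Simple Application")
        val sc = new SparkContext(conf)
        // ... rest of the job as in your code ...
        sc.stop()
      }
    }

with a matching submit, e.g.:

    bin/spark-submit --master "local[*]" --class com.x.y.vr.spark.first.SimpleApp test.jar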
