Try using spark-submit instead of spark-shell. Judging from your output, spark-shell just started an interactive session and ignored try1.scala (it doesn't run a script passed as a positional argument), which is why none of your println() statements produced anything. You could :load try1.scala from inside the shell, but for a streaming job it's cleaner to package it as a standalone app and run it with spark-submit.
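
For spark-submit you'd wrap the code in an object with a main() method and package it as a jar. A minimal sketch of what that could look like (the object name Try1 is just a placeholder, and I've added lines.print() because your snippet registers no output operation on the DStream; without one, ssc.start() will complain that there is nothing to execute):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object Try1 {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("try1").setMaster("local[5]")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Watch the directory; each 10-second batch yields the lines of any new files.
    val lines = ssc.textFileStream("/Users/spr/Documents/big_data/RSA2014/")

    // At least one output operation must be registered before start().
    lines.print()

    ssc.start()
    ssc.awaitTermination()
  }
}

Build it with sbt package against Spark 1.0.2 / Scala 2.10, then launch it with something like this (the jar name is whatever sbt produces for your project):

$S/bin/spark-submit --class Try1 --master local[5] target/scala-2.10/try1_2.10-1.0.jar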


On Tue, Oct 7, 2014 at 3:47 PM, spr <s...@yarcdata.com> wrote:

> I'm probably doing something obviously wrong, but I'm not seeing it.
>
> I have the program below (in a file try1.scala), which is similar but not
> identical to the examples.
>
> import org.apache.spark._
> import org.apache.spark.streaming._
> import org.apache.spark.streaming.StreamingContext._
>
> println("Point 0")
> val appName = "try1.scala"
> val master = "local[5]"
> val conf = new SparkConf().setAppName(appName).setMaster(master)
> val ssc = new StreamingContext(conf, Seconds(10))
> println("Point 1")
> val lines = ssc.textFileStream("/Users/spr/Documents/big_data/RSA2014/")
> println("Point 2")
> println("lines="+lines)
> println("Point 3")
>
> ssc.start()
> println("Point 4")
> ssc.awaitTermination()
> println("Point 5")
>
> I start the program via
>
> $S/bin/spark-shell --master local[5] try1.scala
>
> The messages I get are
>
> mbp-spr:cyber spr$ $S/bin/spark-shell --master local[5] try1.scala
> 14/10/07 17:36:58 INFO SecurityManager: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> 14/10/07 17:36:58 INFO SecurityManager: Changing view acls to: spr
> 14/10/07 17:36:58 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spr)
> 14/10/07 17:36:58 INFO HttpServer: Starting HTTP Server
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 1.0.2
>       /_/
>
> Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.6.0_65)
> Type in expressions to have them evaluated.
> Type :help for more information.
> 14/10/07 17:37:01 INFO SecurityManager: Changing view acls to: spr
> 14/10/07 17:37:01 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spr)
> 14/10/07 17:37:01 INFO Slf4jLogger: Slf4jLogger started
> 14/10/07 17:37:01 INFO Remoting: Starting remoting
> 14/10/07 17:37:02 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://spark@192.168.0.3:58351]
> 14/10/07 17:37:02 INFO Remoting: Remoting now listens on addresses: [akka.tcp://spark@192.168.0.3:58351]
> 14/10/07 17:37:02 INFO SparkEnv: Registering MapOutputTracker
> 14/10/07 17:37:02 INFO SparkEnv: Registering BlockManagerMaster
> 14/10/07 17:37:02 INFO DiskBlockManager: Created local directory at /var/folders/pk/2bm2rq8n0rv499w5s9_c_w6r00003b/T/spark-local-20141007173702-054c
> 14/10/07 17:37:02 INFO MemoryStore: MemoryStore started with capacity 303.4 MB.
> 14/10/07 17:37:02 INFO ConnectionManager: Bound socket to port 58352 with id = ConnectionManagerId(192.168.0.3,58352)
> 14/10/07 17:37:02 INFO BlockManagerMaster: Trying to register BlockManager
> 14/10/07 17:37:02 INFO BlockManagerInfo: Registering block manager 192.168.0.3:58352 with 303.4 MB RAM
> 14/10/07 17:37:02 INFO BlockManagerMaster: Registered BlockManager
> 14/10/07 17:37:02 INFO HttpServer: Starting HTTP Server
> 14/10/07 17:37:02 INFO HttpBroadcast: Broadcast server started at http://192.168.0.3:58353
> 14/10/07 17:37:02 INFO HttpFileServer: HTTP File server directory is /var/folders/pk/2bm2rq8n0rv499w5s9_c_w6r00003b/T/spark-0950f667-aa04-4f6e-9d2e-5a9fab30806c
> 14/10/07 17:37:02 INFO HttpServer: Starting HTTP Server
> 14/10/07 17:37:02 INFO SparkUI: Started SparkUI at http://192.168.0.3:4040
> 2014-10-07 17:37:02.428 java[27725:1607] Unable to load realm mapping info from SCDynamicStore
> 14/10/07 17:37:02 INFO Executor: Using REPL class URI: http://192.168.0.3:58350
> 14/10/07 17:37:02 INFO SparkILoop: Created spark context..
> Spark context available as sc.
>
> Note no messages from any of my "println()" statements.
>
> I could understand that I'm possibly screwing up something in the code, but why am I getting no print-out at all? Suggestions?
>

