Hi all, I cannot start the Thrift server on Spark 1.6.2.
I've configured the binding port and IP and left the default metastore.
The logs show:

16/07/11 22:51:46 INFO NettyBlockTransferService: Server created on 46717
16/07/11 22:51:46 INFO BlockManagerMaster: Trying to register BlockManager
16/07/11 22:51:46 INFO BlockManagerMasterEndpoint: Registering block manager 10.0.2.15:46717 with 511.1 MB RAM, BlockManagerId(driver, 10.0.2.15, 46717)
16/07/11 22:51:46 INFO BlockManagerMaster: Registered BlockManager
16/07/11 22:51:46 INFO AppClient$ClientEndpoint: Executor updated: app-20160711225146-0000/0 is now RUNNING
16/07/11 22:51:47 ERROR SparkContext: Error initializing SparkContext.
java.io.FileNotFoundException: File file:/tmp/spark-events does not exist
        at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:534)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:747)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:524)
        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:409)
        at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:100)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:549)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:56)
        at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:76)
        at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/07/11 22:51:47 INFO SparkUI: Stopped Spark web UI at http://localdev:4040
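From the trace, the failure happens in EventLoggingListener.start, which suggests Spark event logging is enabled but the configured event-log directory (the default `file:/tmp/spark-events` here) does not exist. A possible workaround, assuming that is the cause:

```shell
# The stack trace points at EventLoggingListener failing on
# file:/tmp/spark-events. Creating that directory before starting
# the Thrift server should let SparkContext initialize:
mkdir -p /tmp/spark-events

# Alternatively, in conf/spark-defaults.conf, either point the
# event log at an existing directory (path below is illustrative):
#   spark.eventLog.dir      file:///var/log/spark-events
# or disable event logging altogether:
#   spark.eventLog.enabled  false
```

I haven't verified this against 1.6.2 specifically, so treat it as a guess based on the FileNotFoundException.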

Has anyone seen a similar issue? Any suggestions on the root cause?

Thanks to all!