I tried to run HiveSparkSubmitSuite on my local box, but it fails. The cause
is that Spark is still picking up my local single-node Hadoop cluster when
running the unit test. I don't think it makes sense to do that; the
Hadoop-related environment variables (e.g. HADOOP_CONF_DIR) should be unset
before testing. And I suspect dev/run-tests doesn't do that either.
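
Something like this rough sketch is what I have in mind, assuming the suite
launches spark-submit as a child process (the class and jar names below are
just placeholders):

  // Drop Hadoop-related variables from the child environment so the test
  // runs against Spark's bundled Hadoop instead of a local cluster.
  val pb = new ProcessBuilder("./bin/spark-submit", "--class", "MyTestApp", "my-test.jar")
  val env = pb.environment() // a mutable copy of this JVM's environment
  Seq("HADOOP_CONF_DIR", "HADOOP_HOME", "YARN_CONF_DIR", "HIVE_CONF_DIR")
    .foreach(name => env.remove(name))
  val exit = pb.inheritIO().start().waitFor() // spark-submit now sees a clean env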

Here's the error message:

Cause: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxr-xr-x
[info]   at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
[info]   at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
[info]   at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:162)
[info]   at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:160)
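
(FWIW, the usual workaround for this particular Hive error is to widen the
permissions on the scratch dir of whatever HDFS got picked up, e.g.
hdfs dfs -chmod -R 777 /tmp/hive, but the test shouldn't be touching that
cluster in the first place.)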



-- 
Best Regards

Jeff Zhang
