Re: How to use 'insert overwrite [local] directory' correctly?

2018-08-27 Thread Bang Xiao
Solved the problem by creating the directory on HDFS before executing the SQL, but I hit a new error when I use: INSERT OVERWRITE LOCAL DIRECTORY '/search/odin/test' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' select vrid, query, url, loc_city from custom.common_wap_vr where logdate >= '2018073000' an...
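For reference, a minimal sketch of that workaround (the output path is just an example, and the column list simply mirrors the query above):

    # pre-create the target directory on HDFS first (per this thread,
    # Spark 2.3.0 does not create it automatically the way Hive does)
    hadoop fs -mkdir -p /tmp/test-insert-spark

    INSERT OVERWRITE DIRECTORY '/tmp/test-insert-spark'
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
    SELECT vrid, query, url, loc_city
    FROM custom.common_wap_vr
    WHERE logdate >= '2018073000';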

Re: How to use 'insert overwrite [local] directory' correctly?

2018-08-27 Thread Bang Xiao
With Spark you need to create the directory first, while Hive creates it automatically.

How to use 'insert overwrite [local] directory' correctly?

2018-08-27 Thread Bang Xiao
Spark 2.3.0 supports INSERT OVERWRITE DIRECTORY to write query results directly to the filesystem. I have run into a problem with the SQL "INSERT OVERWRITE DIRECTORY '/tmp/test-insert-spark' select vrid, query, url, loc_city from custom.common_wap_vr where logdate >= '2018073000' and logdate <= '20...
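As a point of comparison, 2.3.0 also accepts the data source form of the statement; a hypothetical sketch writing the same columns as Parquet (the path and format here are only illustrative, not the query from the error above):

    INSERT OVERWRITE DIRECTORY '/tmp/test-insert-spark'
    USING parquet
    SELECT vrid, query, url, loc_city
    FROM custom.common_wap_vr
    WHERE logdate >= '2018073000';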

AM restart on another node puts the SparkSQL job into a hung ("feign death") state

2017-12-20 Thread Bang Xiao
I run "spark-sql --master yarn --deploy-mode client -f 'SQLs'" in a shell. The application gets stuck when the AM goes down and is restarted on another node; it seems the driver just waits for the next SQL. Is this a bug? In my opinion, the application should either retry the failed SQL or exit with a failure when the...
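If it helps anyone hitting the same hang, one possible workaround (an assumption on my side, not something verified against this exact failure) is to cap AM attempts so the application fails outright instead of coming back half-dead; spark.yarn.maxAppAttempts is a standard Spark-on-YARN setting, and queries.sql below is just a placeholder for the SQL file:

    # run the SQL file in yarn client mode, allowing only one AM attempt so a
    # lost AM fails the application instead of restarting on another node
    spark-sql --master yarn --deploy-mode client \
      --conf spark.yarn.maxAppAttempts=1 \
      -f queries.sql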
