Currently, Spark SQL does not support specifying a row format/SerDe in a CTAS (CREATE TABLE AS SELECT) statement. The workaround is to create the table first and then populate it with a separate INSERT ... SELECT, as in the sketch below.
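For example (a rough sketch, not tested against your data: the column types and the avg_cost alias for the third column are assumptions, since the original query leaves them unspecified):

// Create the table first, with the row format in a plain DDL statement (no AS SELECT).
hql("""CREATE TABLE tmp_adclick_gm_all (uv BIGINT, total DOUBLE, avg_cost DOUBLE)
       ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
       LINES TERMINATED BY '\n'""")

// Then fill it with a separate INSERT ... SELECT.
hql("""INSERT OVERWRITE TABLE tmp_adclick_gm_all
       SELECT SUM(uv), round(SUM(cost), 2), round(SUM(cost) / SUM(uv), 2)
       FROM tmp_adclick_sellplat""")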
-----Original Message-----
From: centerqi hu [mailto:cente...@gmail.com]
Sent: Tuesday, September 02, 2014 3:35 PM
To: user@spark.apache.org
Subject: Unsupported language features in query

hql("""CREATE TABLE tmp_adclick_gm_all
       ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
       LINES TERMINATED BY '\n'
       as SELECT SUM(uv) as uv, round(SUM(cost),2) as total, round(SUM(cost)/SUM(uv),2)
       FROM tmp_adclick_sellplat """)

14/09/02 15:32:28 INFO ParseDriver: Parse Completed
java.lang.RuntimeException: Unsupported language features in query: CREATE TABLE tmp_adclick_gm_all ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LINES TERMINATED BY 'abc' as SELECT SUM(uv) as uv, round(SUM(cost),2) as total, round(SUM(cost)/SUM(uv),2) FROM tmp_adclick_sellplat
TOK_CREATETABLE
  TOK_TABNAME
    tmp_adclick_gm_all
  TOK_LIKETABLE
  TOK_TABLEROWFORMAT
    TOK_SERDEPROPS
      TOK_TABLEROWFORMATFIELD
        ','
      TOK_TABLEROWFORMATLINES
        'abc'
  TOK_QUERY
    TOK_FROM
      TOK_TABREF
        TOK_TABNAME
          tmp_adclick_sellplat
    TOK_INSERT
      TOK_DESTINATION
        TOK_DIR
          TOK_TMP_FILE
      TOK_SELECT
        TOK_SELEXPR
          TOK_FUNCTION
            SUM
            TOK_TABLE_OR_COL
              uv
          uv
        TOK_SELEXPR
          TOK_FUNCTION
            round
            TOK_FUNCTION
              SUM
              TOK_TABLE_OR_COL
                cost
            2
          total
        TOK_SELEXPR
          TOK_FUNCTION
            round
            /
              TOK_FUNCTION
                SUM
                TOK_TABLE_OR_COL
                  cost
              TOK_FUNCTION
                SUM
                TOK_TABLE_OR_COL
                  uv
            2

at scala.sys.package$.error(package.scala:27)
at org.apache.spark.sql.hive.HiveQl$.parseSql(HiveQl.scala:255)
at org.apache.spark.sql.hive.HiveContext.hiveql(HiveContext.scala:75)
at org.apache.spark.sql.hive.HiveContext.hql(HiveContext.scala:78)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:22)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:27)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
at $iwC$$iwC$$iwC.<init>(<console>:35)
at $iwC$$iwC.<init>(<console>:37)
at $iwC.<init>(<console>:39)

--
cente...@gmail.com|齐忠