Hello,
I am using Spark 1.3 on AWS.
Spark SQL can't recognize a Hive external table stored on S3.
The error message is below.
I appreciate any help.
Thanks,
Okehee
------  
15/05/24 01:02:18 ERROR thriftserver.SparkSQLDriver: Failed in [select count(*) from api_search where pdate='2015-05-08']
java.lang.IllegalArgumentException: Wrong FS: s3://test-emr/datawarehouse/api_s3_perf/api_search/pdate=2015-05-08/phour=00, expected: hdfs://10.128.193.211:9000
        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:647)
        at org.apache.hadoop.fs.FileSystem.makeQualified(FileSystem.java:467)
        at org.apache.spark.sql.parquet.ParquetRelation2$MetadataCache$$anonfun$6.apply(newParquet.scala:252)
        at org.apache.spark.sql.parquet.ParquetRelation2$MetadataCache$$anonfun$6.apply(newParquet.scala:251)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
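
For context, the setup is roughly the following: a partitioned Hive external table whose LOCATION points at S3, queried through the Spark SQL Thrift server. This is only a sketch of the DDL -- the column list is hypothetical; the table name, partition columns, and S3 location are taken from the error above:

```sql
-- Sketch of the external table setup (column list is an assumption;
-- table name, partitions, and location come from the error message)
CREATE EXTERNAL TABLE api_search (request_id STRING)  -- actual columns unknown
PARTITIONED BY (pdate STRING, phour STRING)
STORED AS PARQUET
LOCATION 's3://test-emr/datawarehouse/api_s3_perf/api_search';

-- The failing query
SELECT COUNT(*) FROM api_search WHERE pdate = '2015-05-08';
```

The stack trace suggests that while caching Parquet metadata, Spark SQL qualifies the partition paths against the cluster's default filesystem (hdfs://10.128.193.211:9000) instead of the table's own S3 filesystem, which is where the "Wrong FS" check fails.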

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/SparkSQL-can-t-read-S3-path-for-hive-external-table-tp23002.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
