Ok, I managed to solve it. As the JIRA issue suggests, it was fixed in 1.2.1; I probably had some old jars on the classpath. Cleaning everything and rebuilding eventually solved the problem.
On Mar 17, 2015 12:25 PM, "Ophir Cohen" <oph...@gmail.com> wrote:
> Hi Guys, and great job!
> I've encountered a weird problem in local mode and I'd be glad to sort it
> out... When trying to save a SchemaRDD into a Hive table, it fails with
> 'TreeNodeException: Unresolved plan found'.
> I found a similar issue in JIRA:
> https://issues.apache.org/jira/browse/SPARK-4825, but I'm using Spark
> 1.2.1 and I still get the same error.
> In cluster mode it works as it should, but it fails in local mode.
> The code I'm using:
>
> val hc = new HiveContext(new SparkContext(
>   new SparkConf().setMaster("local[*]").setAppName("test-app")))
> val file = hc.parquetFile("<path to my file>")
> file.saveAsTable("my_table_name")
>
> And I get the following error:
>
> An exception or error caused a run to abort: Unresolved plan found, tree:
> 'CreateTableAsSelect None, dailyprice, false, None
>  ParquetRelation /home/ophchu/opr/repos/opr-spark/src/test/resources/aapl/derived/splits_divs/reuters/split_adj.pq/part-r-1.parquet,
>  Some(Configuration: core-default.xml, core-site.xml, mapred-default.xml,
>  mapred-site.xml, yarn-default.xml, yarn-site.xml, hdfs-default.xml,
>  hdfs-site.xml), org.apache.spark.sql.hive.HiveContext@a02632b, []
> org.apache.spark.sql.catalyst.errors.package$TreeNodeException: Unresolved
> plan found, tree:
> 'CreateTableAsSelect None, dailyprice, false, None
>  ParquetRelation /home/ophchu/opr/repos/opr-spark/src/test/resources/aapl/derived/splits_divs/reuters/split_adj.pq/part-r-1.parquet,
>  Some(Configuration: core-default.xml, core-site.xml, mapred-default.xml,
>  mapred-site.xml, yarn-default.xml, yarn-site.xml, hdfs-default.xml,
>  hdfs-site.xml), org.apache.spark.sql.hive.HiveContext@a02632b, []
>
> Again, it happens only when running in local mode.
> Thanks!
> Ophir
>