Hi, Sebastian,
To use private APIs, you have to be very familiar with the code path;
otherwise, it is very easy to hit an exception or a bug.
My suggestion is to use IntelliJ to step through the
hiveContext.sql function until you hit the parseSql API. Then, you will
know if you have to ca
What we're trying to achieve is a fast way of testing the validity of our
SQL queries within unit tests without going through the time-consuming task
of setting up a Hive test context.
If there is any way to speed this step up, any help would be appreciated.
Thanks,
Sebastian
*Sebastian Nadorp*
Just curious: why are you using the parseSql API?
It works well if you use the external APIs. For example, in your case:
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
hiveContext.sql("CREATE EXTERNAL TABLE IF NOT EXISTS `t`(`id` STRING, `foo`
INT) PARTITIONED BY (year INT, month INT)")
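Since the expensive part is constructing the HiveContext itself, one common workaround (a sketch, not advice from this thread) is to build it once per JVM and share it across the whole test suite via a lazy singleton. The Spark names below are assumptions; a plain stand-in class is used so the sketch compiles without Spark on the classpath:

```scala
// Sketch of a shared test fixture. In real tests the stand-in below would be
// an org.apache.spark.sql.hive.HiveContext (assumption: Spark 1.5.x API);
// a plain class is used here so the sketch is self-contained.
final class ExpensiveContext {
  // expensive setup (metastore, warehouse dirs, ...) would run here, once
  def sql(query: String): String = s"ran: $query" // stand-in for hiveContext.sql
}

object SharedTestContext {
  // `lazy val` defers construction to first use, then reuses the
  // same instance for every test in the JVM
  lazy val context: ExpensiveContext = new ExpensiveContext
}
```

Each test then calls `SharedTestContext.context.sql(...)`, so only the first test pays the setup cost.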
That's not really intended to be a public API, as there is some internal
setup that needs to be done for Hive to work. Have you created a
HiveContext in the same thread? Is there more to that stacktrace?
On Tue, Oct 20, 2015 at 2:25 AM, Ayoub wrote:
> Hello,
>
> when upgrading to spark 1.5.1 fro