I'd suggest using org.apache.spark.sql.hive.test.TestHive as the context in
unit tests. It takes care of creating separate directories for each
invocation automatically.
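As a sketch, a ScalaTest suite can run its queries against the shared TestHive singleton instead of creating its own HiveContext; the suite and query below are illustrative (the class name and query are made up), and the API shown matches the Spark 1.4.x test sources:

```scala
// Hypothetical suite using the shared TestHive context, which is backed by
// temporary warehouse/metastore directories so suites don't collide on the
// embedded Derby database.
import org.apache.spark.sql.hive.test.TestHive
import org.scalatest.FunSuite

class CustomSqlSuite extends FunSuite {

  test("custom SQL runs against TestHive") {
    // TestHive is a singleton HiveContext; reuse it rather than
    // constructing a new HiveContext per suite.
    val df = TestHive.sql("SELECT 1 AS id")
    assert(df.collect().head.getInt(0) === 1)
  }
}
```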
On Wed, Jul 29, 2015 at 7:02 PM, JaeSung Jun wrote:
Hi,
I'm working on custom SQL processing on top of Spark SQL, and I'm upgrading
it to Spark 1.4.1.
I've hit an error caused by multiple test suites accessing the Hive metastore
at the same time:
Cause: org.apache.derby.impl.jdbc.EmbedSQLException: Another instance of
Derby may have already booted the database
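The embedded Derby metastore permits only one JVM to boot a given metastore_db directory, so suites running in parallel against the same directory will trip this lock. If moving to TestHive isn't immediately practical, one possible workaround (assuming an sbt build; setting syntax is for sbt 0.13.x) is to serialize test execution:

```scala
// build.sbt — run test suites sequentially so only one suite
// touches the embedded Derby metastore at a time.
parallelExecution in Test := false
```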