Hello!

I have a question regarding Hive and Spark.

As far as I know, in order to use Hive-on-Spark one needs to compile Spark
without the Hive profile, but that means it won't be possible to access
Hive from normal Spark jobs.
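
For example (just a minimal sketch, assuming a Spark 1.x assembly built with
-Phive; names are made up), a "normal" Spark job accessing Hive would look
roughly like this, and it relies on HiveContext being present in the assembly:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    object HiveQueryExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("HiveQueryExample"))
        // HiveContext is only available if Spark was built with the Hive profile
        val hiveContext = new HiveContext(sc)
        // hypothetical table name, just for illustration
        hiveContext.sql("SELECT count(*) FROM some_table").collect().foreach(println)
        sc.stop()
      }
    }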

How is the community going to address this issue? By making two different
spark-assembly jars, or something else?


Thanks,
Rostyslav
