Hi All,

How do you create an external Druid table via Spark?

I know that you can do it like this:
https://docs.hortonworks.com/HDPDocuments/HDP3/HDP-3.1.0/using-druid/content/druid_anatomy_of_hive_to_druid.html
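
For concreteness, this is the shape of the ingestion statement from that page, as I'm submitting it through Spark SQL (table, column, and granularity values below are just placeholders):

// A sketch of the Hive-to-Druid CTAS from the HDP doc, submitted
// through Spark SQL; assumes an active SparkSession named `spark`.
// Table, column, and granularity names are placeholders.
spark.sql("""
  CREATE TABLE events_druid
  STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
  TBLPROPERTIES ("druid.segment.granularity" = "DAY")
  AS SELECT CAST(ts AS TIMESTAMP) AS `__time`, item, qty
  FROM events_raw
""")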

But the issue is that Spark's built-in Hive support is based on Hive 1.2.1:
https://spark.apache.org/docs/latest/sql-distributed-sql-engine.html

That version of Hive didn't have support for Druid ingestion, so running
the ingestion query gives me an error.
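
The only workaround I can think of is to prepare the data with Spark and
then hand the actual Druid CTAS to a HiveServer2 running Hive 3 over JDBC.
A rough sketch (host, port, credentials, and table names are placeholders,
and it assumes the Hive JDBC driver is on the classpath):

import java.sql.DriverManager

// 1) Stage the prepared data as a plain Hive table with Spark.
//    Assumes an active SparkSession `spark`; names are placeholders.
spark.table("events_raw")
  .selectExpr("CAST(ts AS TIMESTAMP) AS `__time`", "item", "qty")
  .write.mode("overwrite").saveAsTable("events_staged")

// 2) Run the Druid CTAS through HiveServer2 (Hive 3), which does
//    understand the Druid storage handler, instead of through Spark.
val conn = DriverManager.getConnection(
  "jdbc:hive2://hiveserver2-host:10000/default", "hive", "")
val stmt = conn.createStatement()
stmt.execute(
  """CREATE TABLE events_druid
    |STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
    |TBLPROPERTIES ("druid.segment.granularity" = "DAY")
    |AS SELECT `__time`, item, qty FROM events_staged""".stripMargin)
stmt.close()
conn.close()

But going through JDBC just for the DDL feels clunky.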

What is the best solution for this?

Thanks,
Val
