Some more info:
I'm putting the compression values in hive-site.xml and running the Spark job.
hc.sql("set ") returns the expected (compression) configuration, but
looking at the logs, it creates the tables without compression:
15/04/21 13:14:19 INFO metastore.HiveMetaStore: 0: create_table:
Table(t
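
In case it helps, this is roughly how I'm reading the values back before the
table gets created (a minimal sketch; it assumes hc is the HiveContext from the
job and that hive.exec.compress.output is the key in question):

// print the value of a single key as the SQL layer sees it
hc.sql("SET hive.exec.compress.output").show()
// or dump every setting and grep for "compress"
hc.sql("SET -v").collect().foreach(println)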
Sadly, I'm encountering too many issues migrating my code to Spark 1.3.
I wrote about one problem in another mail, but my main problem is that I can't set
the right compression type.
In Spark 1.2.1 setting the following values was enough:
hc.setConf("hive.exec.compress.output", "true")
hc.setConf("mapreduce