Re: hive on spark job not start enough executors

2016-09-09 Thread 明浩 冯
ble from the parquet. Thanks, Minghao Feng From: Mich Talebzadeh Sent: Friday, September 9, 2016 4:49:55 PM To: user Subject: Re: hive on spark job not start enough executors when you start hive on spark do you set any parameters for the submitted job (or read them f

Re: hive on spark job not start enough executors

2016-09-09 Thread Mich Talebzadeh
when you start hive on spark do you set any parameters for the submitted job (or read them from init file)?

set spark.master=yarn;
set spark.deploy.mode=client;
set spark.executor.memory=3g;
set spark.driver.memory=3g;
set spark.executor.instances=2;
set spark.ui.port=;

Dr Mich Talebzadeh
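One common way to apply settings like these to every session is an init file, as the message suggests. A minimal sketch of such a file, assuming the `~/.hiverc` location and reusing the values quoted above (the port value is omitted in the original and left out here too):

```sql
-- ~/.hiverc (illustrative): set statements run at the start of each Hive CLI session
set spark.master=yarn;
set spark.deploy.mode=client;
set spark.executor.memory=3g;
set spark.driver.memory=3g;
set spark.executor.instances=2;
```

Note that a fixed `spark.executor.instances` interacts with dynamic allocation: in Spark 1.x, explicitly setting the instance count disables dynamic allocation, which may matter for the problem described in this thread.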

hive on spark job not start enough executors

2016-09-09 Thread 明浩 冯
Hi there, I encountered a problem that gives Hive on Spark very low performance. I'm using Spark 1.6.2 and Hive 2.1.0, and I specified

spark.shuffle.service.enabled true
spark.dynamicAllocation.enabled true

in my spark-default.conf file (the file is in both spark and hive conf
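For context, dynamic allocation is usually paired with bounds on the executor count; without them the defaults apply. A sketch of the relevant `spark-defaults.conf` entries, where the min/max/initial values are illustrative assumptions and not from the thread:

```properties
# spark-defaults.conf (illustrative values)
spark.shuffle.service.enabled            true
spark.dynamicAllocation.enabled          true
# bounds on how many executors YARN may allocate for the job
spark.dynamicAllocation.minExecutors     2
spark.dynamicAllocation.initialExecutors 2
spark.dynamicAllocation.maxExecutors     20
```

Dynamic allocation also requires the external shuffle service to be running on each NodeManager (`spark.shuffle.service.enabled true` alone enables the client side; the YARN auxiliary service must be configured separately in `yarn-site.xml`).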