I submit 5 queries from 5 different scripts. When I use the Hive CLI, I can see 
the jobs for all 5 queries in the application tracker in different states 
(running/accepted, etc.). When I submit through Beeline, only one job appears in 
the application tracker; the other jobs appear only after the first job finishes.

Ex: hive -f script1.hql &
      hive -f script2.hql &
      hive -f script3.hql &
      hive -f script4.hql &
      hive -f script5.hql &
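The fan-out pattern above can also be wrapped in a small script. The sketch below mirrors the "hive -f … &" approach: each submission is backgrounded and `wait` blocks until all of them return. `run_script` is a placeholder for the real client call (the hive or beeline invocation with your connection details); it is not part of either tool.

```shell
#!/bin/sh
# Sketch of the parallel-submission pattern discussed in this thread.
# run_script is a stand-in for the actual client call, e.g.:
#   hive -f "$1"
#   beeline -u "jdbc:hive2://<host>:<port>" -n <user> -p <pass> -f "$1"
LOG="$(mktemp)"
run_script() {
    echo "submitted $1" >> "$LOG"   # placeholder for the real client call
}

for s in script1.hql script2.hql script3.hql script4.hql script5.hql; do
    run_script "$s" &               # background each submission, as with "hive -f ... &"
done
wait                                # block until every backgrounded submission returns
echo "all submissions finished"
```

Note that launching the clients concurrently only submits the jobs concurrently; whether they then *run* in parallel depends on the server side (HiveServer2 session handling, scheduler queues), which is exactly the difference being discussed here.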


From: Xuefu Zhang [mailto:xzh...@cloudera.com]
Sent: Friday, July 11, 2014 7:29 PM
To: user@hive.apache.org
Subject: Re: beeline client

Chandra,
The difference you saw between the Hive CLI and Beeline might indicate a bug. 
However, before drawing that conclusion, could you give an example of your 
queries? Do you expect the jobs to run in parallel for a single query? Please 
note that your script file is executed line by line in either case.
--Xuefu

On Thu, Jul 10, 2014 at 11:55 PM, Bogala, Chandra Reddy 
<chandra.bog...@gs.com<mailto:chandra.bog...@gs.com>> wrote:
Hi,
   Currently I am submitting multiple Hive jobs using the Hive CLI with “hive -f” 
from different scripts. All of these jobs are visible in the application tracker 
and get processed in parallel.
Now I plan to switch to HiveServer2, submitting jobs using the Beeline client 
from multiple scripts, for example: “nohup beeline -u 
jdbc:hive2://<host>:<port>#<variable>=$epoch -n <user> -p <pass> -d 
org.apache.hive.jdbc.HiveDriver -f <hql script>.hql &”.
But the jobs get submitted serially: only one job appears in the application 
tracker at a time, even though cluster resources are available. Only once that 
job finishes does the next job get submitted. Why is that? Is there a setting 
that needs to be set to submit jobs in parallel?

Thanks,
Chandra
