Hi, I am new to Hive and am trying to execute a Hive script/query on AWS using the Java API (the StepFactory class). My Hive query requires some arguments. I am able to start the cluster and install Hive on it, but I get an error on the Hive execution step (the error says AWS is unable to understand the arguments).
Below is the code snippet that I have tried:

    StepConfig runHive = new StepConfig()
        .withName("Run Hive")
        .withActionOnFailure("TERMINATE_JOB_FLOW")
        .withHadoopJarStep(stepFactory.newRunHiveScriptStep(
            eventSubsetHiveScript,
            " -d", " S3_INPUT_BUCKET=" + in_bucketname,
            " -d", " S3_OUT_BUCKET=" + out_bucketname,
            " -d", " DT=" + in_dt));

The error that I get in the stderr logs is:

    Unrecognised option -d

I have also tried collapsing all the defines into a single argument string:

    .withHadoopJarStep(stepFactory.newRunHiveScriptStep(
        eventSubsetHiveScript,
        " -d", " S3_INPUT_BUCKET=" + in_bucketname
            + " -d S3_OUT_BUCKET=" + out_bucketname
            + " -d DT=" + in_dt));

but with the same result; this time the error is:

    Unrecognised option -d S3_INPUT_BUCKET=s3://my-input -d S3_OUT_BUCKET=s3://my-output...

Please help me get this correct. How do I pass arguments to the Hive query using the Java API?

Regards,
Puneet Khatod
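One thing I suspect is the leading spaces: a padded " -d" is not the same token as the option "-d", since each vararg is passed to the Hive CLI verbatim. As a minimal, self-contained sketch of how I plan to build the argument list instead (the helper name buildHiveArgs and the sample DT value are my own, not from the SDK):

```java
import java.util.ArrayList;
import java.util.List;

public class HiveArgsSketch {
    // Build the vararg array for stepFactory.newRunHiveScriptStep(script, args...).
    // Each "-d" flag and each NAME=value pair is its own element, with no
    // leading or trailing whitespace.
    static String[] buildHiveArgs(String... defines) {
        List<String> args = new ArrayList<String>();
        for (String define : defines) {
            args.add("-d");     // the flag itself, untrimmed
            args.add(define);   // e.g. "S3_INPUT_BUCKET=s3://my-input"
        }
        return args.toArray(new String[0]);
    }

    public static void main(String[] unused) {
        String[] args = buildHiveArgs(
                "S3_INPUT_BUCKET=s3://my-input",
                "S3_OUT_BUCKET=s3://my-output",
                "DT=2012-01-01");   // sample date value for illustration
        // Prints each element on its own line, alternating "-d" and NAME=value.
        for (String a : args) {
            System.out.println(a);
        }
    }
}
```

The result could then be passed as stepFactory.newRunHiveScriptStep(eventSubsetHiveScript, buildHiveArgs(...)), if I understand the vararg signature correctly.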