Hi Fawze,

Actually, the <jar/> element of the <spark/> action can contain a comma-separated list of JAR or ZIP files, so you can already add more than one JAR file to the Spark application.
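For example, a <jar/> element listing several files could look like this (the paths and file names below are illustrative, not taken from your workflow):

```xml
<!-- Comma-separated list of JARs in the <jar> element of the <spark> action.
     Paths are examples only; substitute your own HDFS locations. -->
<jar>hdfs:///tmp/app/main-application.jar,hdfs:///tmp/libs/extra-dependency.jar</jar>
```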
Please see the documentation on *Spark action details <https://oozie.apache.org/docs/5.0.0-beta1/DG_SparkActionExtension.html>*.

Regards,
Andras

On Sun, Mar 25, 2018 at 8:20 PM, Fawze Abujaber <fawz...@gmail.com> wrote:

> Hi All,
>
> I'm using an Oozie action workflow for a Spark job, and I want to add the
> --jars option to my Oozie workflow. I'm aware it's not a straightforward
> solution.
>
> Here is my workflow; I want to add --jars to the spark-opts:
>
> <action name='aggregation_group_3'>
>     <spark xmlns="uri:oozie:spark-action:0.1">
>         <job-tracker>${jobTracker}</job-tracker>
>         <name-node>${nameNode}</name-node>
>         <configuration>
>             <property>
>                 <name>mapred.job.queue.name</name>
>                 <value>${queue}</value>
>             </property>
>         </configuration>
>         <master>${master}</master>
>         <mode>${mode}</mode>
>         <name>aggregation_group_3</name>
>         <class>xxxxxx</class>
>         <jar>hdfs:///tmp/${jarVersion}/aggregator-code-${jarVersion}-jar-with-dependencies.jar</jar>
>         <spark-opts>--driver-memory ${driver_memory} --num-executors ${num_executors_3} --executor-cores ${executor_cores_3} --executor-memory ${executor_memory_3} --queue ${queue} ${files} --driver-java-options -Dcp.days.lookback=${days_lookback}</spark-opts>
>         <arg>${applicationConf}</arg>
>         <arg>hdfs://tmp/conf/rpt_agg_15m_skill_camp_distinct.yaml,hdfs://tmp/conf/rpt_agg_15m_skill_distinct.yaml,hdfs://tmp/conf/rpt_agg_15m_lob_distinct.yaml</arg>
>         <arg>xxxx</arg>
>     </spark>
>     <ok to="xxxxx"/>
>     <error to="fail-job"/>
> </action>