How about this:
1. You create a primary key in your custom system.
2. Schedule the job with that custom primary key as the job name.
3. After setting up the Spark context (inside the job), get the application
id. Then save the mapping of app name and app id from the Spark job to your
custom database, through some
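The mapping step above can be sketched as follows. This is a minimal sketch, not the poster's actual system: the sqlite backing store, the table name, and the helper names are assumptions for illustration. Inside the running job, the app id would come from the live context (in PySpark, `sc.applicationId`); here it is passed in as a parameter so the sketch stays self-contained.

```python
import sqlite3

def record_app_mapping(db_path, job_name, app_id):
    """Persist the job-name -> application-id mapping.

    In the real job, app_id would be read from the live Spark context,
    e.g. app_id = sc.applicationId (PySpark); it is a plain parameter
    here so the example runs without a Spark installation.
    """
    conn = sqlite3.connect(db_path)
    try:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS job_apps "
            "(job_name TEXT PRIMARY KEY, app_id TEXT)"
        )
        # INSERT OR REPLACE: re-running the same job overwrites the old id.
        conn.execute(
            "INSERT OR REPLACE INTO job_apps VALUES (?, ?)",
            (job_name, app_id),
        )
        conn.commit()
    finally:
        conn.close()

def lookup_app_id(db_path, job_name):
    """Return the recorded app id for a job name, or None if unknown."""
    conn = sqlite3.connect(db_path)
    try:
        row = conn.execute(
            "SELECT app_id FROM job_apps WHERE job_name = ?",
            (job_name,),
        ).fetchone()
        return row[0] if row else None
    finally:
        conn.close()
```

With this in place, the scheduler can resolve its own primary key back to the Spark application id at any time after the job has registered itself.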
The default implementation is to use the current time in milliseconds. For
Mesos it is the framework id; if you are using Mesos, you can assume that the
framework id used to register your app is the same as the app id.
As you said, you have a system application to schedule Spark jobs, so you can
keep track of the framework ids submitted by
Currently Spark sets the current time in milliseconds as the app id. Is there
a way to pass an app id to the Spark job, so that it uses the provided app id
instead of generating one from the time?
Let's take the following scenario: I have a system application which
schedules Spark jobs, and rec