Hi all,

I need to be able to create, submit and report on Spark jobs
programmatically in response to events arriving on a Kafka bus. I also need
end-users to be able to create data queries that launch Spark jobs 'behind
the scenes'.
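
Something like the sketch below is roughly what I have in mind for the
event-driven part: a plain Kafka consumer that launches a job through
SparkLauncher whenever a message arrives. (Only a sketch; the broker
address, topic name, jar path, main class and master URL are all
placeholders.)

import java.time.Duration
import java.util.{Collections, Properties}
import scala.jdk.CollectionConverters._
import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}

object KafkaTriggeredLauncher {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "localhost:9092") // placeholder broker
    props.put("group.id", "job-trigger")
    props.put("key.deserializer",
      "org.apache.kafka.common.serialization.StringDeserializer")
    props.put("value.deserializer",
      "org.apache.kafka.common.serialization.StringDeserializer")

    val consumer = new KafkaConsumer[String, String](props)
    consumer.subscribe(Collections.singletonList("job-requests")) // placeholder topic

    while (true) {
      for (record <- consumer.poll(Duration.ofSeconds(1)).asScala) {
        // One Spark job per event; the returned handle reports state changes.
        new SparkLauncher()
          .setAppResource("/path/to/job.jar")  // placeholder jar
          .setMainClass("com.example.MyJob")   // placeholder class
          .setMaster("spark://master:7077")    // placeholder master
          .setAppName(s"event-job-${record.key()}")
          .startApplication(new SparkAppHandle.Listener {
            override def stateChanged(h: SparkAppHandle): Unit =
              println(s"${h.getAppId}: ${h.getState}")
            override def infoChanged(h: SparkAppHandle): Unit = ()
          })
      }
    }
  }
}

That would cover submission, but the reporting side is what I'm less
sure about.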

I would expect to use the same API for both, and to be able to provide a
user-friendly view (i.e. *not* the Spark web UI) of all jobs (user and
system) that are currently running, have completed, or have failed.
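
For that view, I know Spark exposes a monitoring REST endpoint (on a
running driver at port 4040, or via the history server at 18080), so I
could imagine polling it and rendering the results myself, e.g.
(spark-shell style, assuming a history server on localhost):

import scala.io.Source

// Returns a JSON array of applications, each with an id, name, and
// per-attempt start/end times plus a completed flag.
val apps = Source.fromURL("http://localhost:18080/api/v1/applications").mkString
println(apps)

But building my own dashboard on top of that feels like reinventing the
wheel, hence the question.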

Are there any tools / add-ons for this? Or is there a suggested approach?

Thanks
