Hello,

I am considering Aurora for a key component of our infrastructure.
Awesome work being done here.

My question is: How suitable is Aurora for running short-lived tasks?

Background: We (Code Climate) do static analysis of tens of thousands
of repositories every day. We run a variety of forms of analysis with
heterogeneous resource requirements, hence our interest in Mesos.

Looking at Aurora, a lot of the core features look very helpful to us.
Where I am getting hung up is figuring out how to model short-lived
tasks as tasks/jobs. Long-running resource allocations are not really
an option for us due to the variation in our workloads.

My first thought was to create a Task for each type of analysis we
run, and then start a new Job with the appropriate Task every time we
want to run analysis (regulated by a queue). This doesn't seem to work,
though: as far as I can tell, I can't `aurora create` the same `.aurora`
file multiple times with different Job names. There is also the problem
of how to customize each Job slightly (e.g., with a payload).
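For concreteness, here is a rough sketch of the sort of config I was
imagining (the file name, cluster/role names, and the {{payload}}
variable are just placeholders, not something I have working):

    # analysis.aurora -- sketch only
    run_analysis = Process(
      name = 'run_analysis',
      # {{payload}} is where the per-run work definition would go; how
      # to bind it for each invocation is exactly what I don't know
      cmdline = 'bin/analyze --payload {{payload}}'
    )

    analysis_task = Task(
      name = 'analysis',
      processes = [run_analysis],
      resources = Resources(cpu = 2, ram = 2*GB, disk = 4*GB)
    )

    jobs = [
      Job(
        cluster = 'our-cluster',
        role = 'codeclimate',
        environment = 'prod',
        name = 'analysis',  # ideally unique per run; this is where I'm stuck
        task = analysis_task
      )
    ]

The idea would then be to run `aurora create` once per queued job, with
a different Job name and payload each time, if that is even a sane
thing to do.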

An obvious alternative is to create a unique Task every time we want
to run work. This would result in tens of thousands of tasks being
created every day, and from what I can tell Aurora is not intended to
be used that way. (Please correct me if I am wrong.)

Basically, I would like to hook my job queue up to Aurora to perform
the actual work. There are a dozen different types of jobs, each with
different performance requirements. Every time a job runs, it has a
unique payload containing the definition of the work to be performed.

Can Aurora be used this way? If so, what is the proper way to model
this with respect to Jobs and Tasks?

Any/all help is appreciated.

Thanks!

-Bryan

-- 
Bryan Helmkamp, Founder, Code Climate
br...@codeclimate.com / 646-379-1810 / @brynary
