I was facing this problem myself recently. I am running Django 1.0.2 with
Python 2.5.1 on Webfaction... I have a Gallery app that lets you upload
photos through a web form, or else you can dump photos into the gallery
directory with scp and the gallery auto-discovers them and builds model
instances for them. As part of this process, an array of scaled photos is
created and deposited in various directories with PIL.

The normal way I use my app is I create the gallery, dump in all the photos
I want with scp (way faster than the web form, I only included that option
for completeness), then I visit the gallery to build the data structures and
thumbnails, claim myself as the poster for all unclaimed photos, and fill in
descriptions. The problem is that the image scaling can take a long time for
a big batch of photos, and I don't want the request to hang forever, or time
out a couple of times before it's done. I would like a background process to
do it.

I looked at the links suggested... I like django-chronograph, but I'm not
sure if I want to handle it through cron jobs... what I envisioned was a
subprocess or stand-alone server process that gets fed the jobs to make the
scaled images, while the database part of the job gets done nice and quick
with the user seeing a fast response. My thumbnail template tag is smart
enough to fill in a "question mark" thumbnail for contexts where an image
isn't ready yet, so that's OK. Django Queue Service looks cool, but I have a
wee aversion to interprocess communication over HTTP.

I would LOVE to build something using the new multiprocessing module in
Python 2.6, but I am not sure I want to fiddle with the backport for 2.5 :(.
I finally settled on something horrible and backwards but it works pretty
well :).

I have a thumbmaker app, which is a Python script living under my gallery
app in its own directory along with an empty lock file. It has a
subdirectory called tasks. When a Picture is created, the post_save signal
calls the create_thumbs method of the model, which writes a file to the
tasks directory containing the path to the photo. The filename is a number;
each new task gets the next consecutive number available in the directory. A
middleware checks on every response whether there are tasks, and if so tries
to get a non-blocking exclusive lock on the lock file with flock. If it
succeeds, it relinquishes the lock and uses Popen to start the thumbmaker
script, with the Django app's sys.path copied into its environment. If it
fails, it just moves on. When the thumbmaker starts, it takes a lock on the
lock file and doesn't relinquish it until there are no more tasks. It
processes each task, lowest file number first, reading and then deleting the
file.
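In sketch form, the task-writing half looks something like this (the tasks
path is made up, and I've written it as standalone modern Python rather than
the 2.5 code I actually run):

```python
import os

# Illustrative path; in my app this is the tasks/ subdirectory
# under the thumbmaker script's own directory.
TASKS_DIR = "/path/to/gallery/thumbmaker/tasks"


def create_thumbs(photo_path, tasks_dir=TASKS_DIR):
    """Queue a thumbnail job: write the photo's path into a task file
    named with the next consecutive number in the tasks directory."""
    numbers = [int(name) for name in os.listdir(tasks_dir) if name.isdigit()]
    next_number = max(numbers) + 1 if numbers else 1
    task_path = os.path.join(tasks_dir, str(next_number))
    with open(task_path, "w") as f:
        f.write(photo_path)
    return task_path
```

Since only this tiny file write happens in the request cycle, the database
part of the job stays fast and the slow PIL work is deferred to the worker.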

So in other words, I have a very crude file-based queue, with primitive
locking to make sure only one worker process is spawned (would not do to go
over my system resource limits) and middleware to fire it up. I feel a
little dirty lol, but I have to say it works.
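The lock dance is the only subtle bit, so here is a rough sketch of both
sides of it (function names and the spawn callable are my own illustrations,
and flock is Unix-only):

```python
import fcntl
import os


def try_spawn_worker(lock_path, tasks_dir, spawn):
    """Middleware-side check: if there are queued tasks and no worker
    currently holds the lock, start one worker and return True."""
    if not os.listdir(tasks_dir):
        return False  # nothing to do
    lock_file = open(lock_path, "w")
    try:
        # Non-blocking exclusive lock: fails immediately if the
        # thumbmaker already holds it.
        fcntl.flock(lock_file, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except IOError:
        lock_file.close()
        return False  # a worker is already running; just move on
    # Relinquish the lock so the spawned worker can take it for itself.
    fcntl.flock(lock_file, fcntl.LOCK_UN)
    lock_file.close()
    spawn()  # e.g. wrap subprocess.Popen of the thumbmaker script
    return True


def drain_tasks(tasks_dir, handle):
    """Worker side: process tasks lowest number first, deleting each
    task file after reading the photo path it contains."""
    while True:
        numbers = sorted(int(n) for n in os.listdir(tasks_dir) if n.isdigit())
        if not numbers:
            return
        task_path = os.path.join(tasks_dir, str(numbers[0]))
        with open(task_path) as f:
            photo_path = f.read()
        os.remove(task_path)
        handle(photo_path)  # make the scaled images here
```

The worker takes the same flock when it starts and holds it until
drain_tasks finds the directory empty, which is what guarantees at most one
worker at a time.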

Stephen

On Wed, Mar 11, 2009 at 7:43 PM, Mad Sweeney <madswee...@eircom.net> wrote:

>
> Hello,
>
> Is there an existing Django app that does:
> * Allows users to run predefined jobs (shell scripts, SQL etc.) in the
> background.
> * Allows users to view the status of jobs: (Running, Scheduled, Completed).
> * Allows users to view the output from jobs (spreadsheets, reports, log
> files).
>
> Thanks for any pointers
>

You received this message because you are subscribed to the Google Groups
"Django users" group.
