I'm actually using the web2py cron.
Is there a way I can launch the cron's external process from the action,
without waiting for the 60 seconds?
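One way to avoid the wait, sketched here under the assumption that the queue-processing function can be imported by the action: fire it in a background daemon thread as soon as the job is queued, and keep the cron tick as a safety net for anything the thread misses.

```python
import threading

def trigger_worker(process_job):
    # Run process_job in a background daemon thread and return the
    # thread object, so the queuing action can respond immediately
    # instead of waiting up to 60 seconds for the next cron tick.
    t = threading.Thread(target=process_job)
    t.daemon = True
    t.start()
    return t

# toy demonstration: the "heavy job" just records that it ran
results = []
worker = trigger_worker(lambda: results.append('done'))
worker.join()
```

The daemon flag means a dying server process won't hang on the worker; any job it drops is picked up by the next cron run.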

On Jul 10, 3:30 pm, mdipierro <mdipie...@cs.depaul.edu> wrote:
> You can use web2py cron and trigger the processing immediately after
> the action that queues the tasks returns.
>
> Massimo
>
> On Jul 10, 8:11 am, kralin <andrea.pierle...@gmail.com> wrote:
>
> > I finally made it, at least for now.
>
> > There is a database table where all the jobs are stored, both
> > pending ones and completed ones.
>
> > db.define_table('work',
> >                 SQLField('name','string'),
> >                 SQLField('status','string'),
> >                 SQLField('priority','integer'),
> >                 SQLField('input','text'),
> >                 SQLField('results','text'),
> >                 SQLField('date_submitted','datetime',default=datetime.datetime.now()),
> >                 SQLField('date_completed','datetime'),
> >                 )
>
> > then there is a cron job that looks at the table every minute and, if
> > there are jobs to process, runs the first one.
>
> > def process_job():
> >     queue=db(db.work.status=='to be done').select(orderby=db.work.priority)
> >     if queue:
> >         job=queue[0]
> >         #do stuff like:
> >         input_string=job.input
> >         output_string='Processed by cron worker : '+str(input_string)
> >         #submit results
> >         db.work[job.id]=dict(status='completed',results=output_string,
> >                              date_completed=datetime.datetime.now())
> >         db.commit()
> >     return
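One gap in the cron worker above: if two cron invocations overlap, both can select the same pending row. A minimal sketch of a claim-before-process step, using the stdlib sqlite3 module for illustration (with web2py's DAL the same idea is an update inside one transaction; the table and status values follow the example above):

```python
import sqlite3

def claim_next_job(conn):
    # Atomically claim the highest-priority pending job by flipping
    # its status to 'in progress' before doing any work, so two
    # overlapping cron runs never process the same row.
    cur = conn.execute(
        "SELECT id FROM work WHERE status='to be done' "
        "ORDER BY priority LIMIT 1")
    row = cur.fetchone()
    if row is None:
        return None
    job_id = row[0]
    # the WHERE clause re-checks the status, so if a concurrent
    # claimer got there first this UPDATE is a no-op
    changed = conn.execute(
        "UPDATE work SET status='in progress' "
        "WHERE id=? AND status='to be done'", (job_id,)).rowcount
    conn.commit()
    return job_id if changed else None

# toy demonstration with two pending jobs
conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE work (id INTEGER PRIMARY KEY, "
             "status TEXT, priority INTEGER)")
conn.execute("INSERT INTO work (status, priority) VALUES ('to be done', 1)")
conn.execute("INSERT INTO work (status, priority) VALUES ('to be done', 2)")
claimed = claim_next_job(conn)
```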
>
> > the problem with this is that I have to wait one minute for each table
> > lookup, and I'm not sure it will work correctly under WSGI, since
> > I do not want to rely on system cron.
> > The fastest improvement would be to run all the processes in the queue
> > each time the controller is called by the cron, one by one, but I'm
> > still convinced that using cron is not the best way to do this, even
> > if it works.
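The "run all the processes in the queue" improvement can be sketched as a drain loop: instead of one job per cron tick, keep pulling pending jobs until none remain. The `fetch_next` and `run` callables here are placeholders for the table lookup and the heavy work from the example above.

```python
def process_all_jobs(fetch_next, run):
    # Drain the queue in one cron invocation instead of processing
    # one job per minute. fetch_next() returns the next pending job
    # or None when the queue is empty; run(job) does the work.
    done = 0
    while True:
        job = fetch_next()
        if job is None:
            break
        run(job)
        done += 1
    return done

# toy demonstration with an in-memory queue
pending = ['a', 'b', 'c']
processed = []
count = process_all_jobs(lambda: pending.pop(0) if pending else None,
                         processed.append)
```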
>
> > Am I the only one with this kind of issue? If not, maybe I can post
> > the "solution" to AlterEgo...
>
> > On Jul 10, 3:19 am, mdipierro <mdipie...@cs.depaul.edu> wrote:
>
> > > You can use the database as long as the objects are serializable
>
> > > On Jul 9, 6:00 pm, kralin <andrea.pierle...@gmail.com> wrote:
>
> > > > thanks massimo, very elegant...
> > > > I have to check for the lock thing; otherwise a list is more than
> > > > sufficient for the queue:
>
> > > > queue=cache.ram('queue',lambda:[],10**10)
> > > > queue.append(o)
> > > > queue.pop(0)
>
> > > > it should be also very easy to use a priority schedule by appending
> > > > [int(priority), task] to the queue list.
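The [int(priority), task] idea can be sketched with the stdlib heapq module; a tie-breaking counter keeps FIFO order among jobs of equal priority (class and method names here are illustrative, not part of web2py):

```python
import heapq
import itertools

class PriorityQueue:
    # Wrap a heap of (priority, counter, task) tuples: pop always
    # returns the lowest-priority (most urgent) task, and the counter
    # makes equal-priority tasks come out first-come first-served.
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()
    def push(self, priority, task):
        heapq.heappush(self._heap, (priority, next(self._counter), task))
    def pop(self):
        return heapq.heappop(self._heap)[2]
    def __len__(self):
        return len(self._heap)

q = PriorityQueue()
q.push(2, 'slow report')
q.push(1, 'urgent fix')
q.push(2, 'another report')
first = q.pop()
```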
>
> > > > however I still need to cope with the guy that's going to do the
> > > > heavy job.
> > > > Is it possible to set up a worker in web2py that is able to read the
> > > > input from the queue and put the results in a database, or do I have
> > > > to launch a cron each minute to check the queue?
>
> > > > On 10 Lug, 00:00, mdipierro <mdipie...@cs.depaul.edu> wrote:
>
> > > > > You can do something like
>
> > > > > class Queue:
> > > > >     forever = 10**10
> > > > >     def __init__(self):
> > > > >         import thread
> > > > >         self.q=[]
> > > > >         self.lock=thread.allocate_lock()
> > > > >     def enqueue(self,o):
> > > > >         self.lock.acquire()
> > > > >         self.q.append(o)
> > > > >         self.lock.release()
> > > > >     def dequeue(self):
> > > > >         self.lock.acquire()
> > > > >         o = self.q[0]
> > > > >         del self.q[0]
> > > > >         self.lock.release()
> > > > >         return o
> > > > >     def __len__(self):
> > > > >         self.lock.acquire()
> > > > >         ell=len(self.q)
> > > > >         self.lock.release()
> > > > >         return ell
>
> > > > > queue=cache.ram('queue',lambda:Queue(),Queue.forever)
> > > > > queue.enqueue('object')
> > > > > print len(queue)
> > > > > o = queue.dequeue()
>
> > > > > If you define it in a model it will be visible everywhere, including
> > > > > cron scripts. I am not sure if the lock is really necessary (because
> > > > > of the global interpreter lock) but it is safe.
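For what it's worth, the standard library already ships a locked FIFO, so the manual lock bookkeeping above can be delegated to it; a sketch with queue.Queue (the module is named Queue in Python 2):

```python
from queue import Queue, Empty  # the stdlib's thread-safe FIFO

# put/get/qsize handle the locking internally; get_nowait() raises
# Empty instead of blocking when the queue has been drained.
q = Queue()
q.put('object')
size = q.qsize()
item = q.get_nowait()
try:
    q.get_nowait()
    underflow = False
except Empty:
    underflow = True
```

Like the hand-rolled class, this can be stored in cache.ram from a model so every request and cron script shares one instance.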
>
>
> > > > > On Jul 9, 4:07 pm, kralin <andrea.pierle...@gmail.com> wrote:
>
> > > > > > Hi All,
> > > > > > I'm diving into web2py, which at the moment gives me the power
> > > > > > and the time to do a lot of cool things. so thanks a lot to you
> > > > > > guys.
> > > > > > I was wondering if any of you ever had the need to set a queue
> > > > > > for a heavy process.
> > > > > > Let's suppose I've got one heavy process that takes 2 mins to run
> > > > > > on a single-processor machine.
> > > > > > now if 10 users submit the process at (more or less) the same
> > > > > > time, is there a way to schedule the process based on a first
> > > > > > come, first served rule? each of these users should be able to
> > > > > > see their result once completed, or see something like "wait, 8
> > > > > > jobs still in queue"
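The "wait, 8 jobs still in queue" message reduces to counting how many queued jobs precede a user's job. A sketch over a plain first-come-first-served list of pending job ids (with a database table this becomes a count of earlier pending rows):

```python
def jobs_ahead(pending_ids, job_id):
    # Position of job_id in the pending list equals the number of
    # jobs that will run before it under first-come, first-served.
    return pending_ids.index(job_id)

pending = [11, 12, 13, 14]     # hypothetical pending job ids, oldest first
ahead = jobs_ahead(pending, 14)
message = 'wait, %d jobs still in queue' % ahead
```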
>
> > > > > > I know it can be done easily with an external job scheduler, but
> > > > > > it would require a separate process, not being portable, and a
> > > > > > little bit more headache...
>
> > > > > > a quick and dirty solution would be to run a cron each, let's
> > > > > > say, 2 mins and if a process is in the queue, execute it, but I
> > > > > > feel it can be done in a much more elegant way.
>
> > > > > > what do you think?
You received this message because you are subscribed to the Google Groups 
"web2py Web Framework" group.
To post to this group, send email to web2py@googlegroups.com
To unsubscribe from this group, send email to 
web2py+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/web2py?hl=en