Yeah, I will try to create it as a script, call it from the web interface,
and follow the status in a file, or maybe have the script write to the
database...
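Something like this, maybe (a rough, untested sketch; the script path, the
status file and its format are all made up):

import json
import subprocess

def launch_batch(hosts):
    # start the worker script in its own process so the web request
    # returns immediately; the script owns the ssh pool and the status file
    subprocess.Popen(['python', '/path/to/ssh_batch.py'] + hosts)

def read_status(status_file='/tmp/ssh_batch_status.json'):
    # the worker script rewrites this file as it progresses,
    # e.g. {"done": 120, "total": 1000, "failed": 3}
    try:
        return json.load(open(status_file))
    except IOError:
        return dict(done=0, total=0, failed=0)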

Thanks for your response!

Regards,

Tito

On Mon, Jul 30, 2012 at 5:54 AM, Niphlod <niph...@gmail.com> wrote:

> usually python webservers execute your functions in their own threads, and
> starting your own threads is generally not recommended.
> For things like that you'd have to resort to an external process executing
> your code and collecting the results somewhere the app can fetch them
> (in an async way).
> Some queue implementations exist; you can try web2py's scheduler for the
> job (and you can probably simplify the implementation details).
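>
> Roughly something like this (an untested sketch; run_ssh_batch, the
> "results" table and its fields are placeholders, and db is the usual
> connection defined in your model):
>
>     # models/scheduler.py (sketch)
>     import subprocess
>     from gluon.scheduler import Scheduler
>
>     def run_ssh_batch(hostnames, command):
>         # runs in a scheduler worker process, not in the webserver
>         for hostname in hostnames:
>             p = subprocess.Popen(
>                 ['ssh', '-q', 'support@%s' % hostname, command],
>                 stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
>             out = p.communicate()[0]
>             db.results.insert(host=hostname, output=out, rc=p.returncode)
>             db.commit()  # commit per host so the app can watch progress
>         return 'done'
>
>     scheduler = Scheduler(db, dict(run_ssh_batch=run_ssh_batch))
>
>     # a controller then queues a job (and returns immediately) by adding
>     # a row to the scheduler_task table the Scheduler creates, with
>     # function_name='run_ssh_batch' and the host list in its vars field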
>
> PS: no webapp (and probably no server) would scale to 1000 concurrent ssh
> connections in 1000 different threads. With a pool of 20, that would
> mean 50 batches to process a single request... at the ~10 seconds per
> host your test sleeps, that's around 500 seconds, which would probably
> hit a timeout before returning meaningful results if executed inside a
> webserver (even if you managed to get your threads running correctly).
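>
> (BTW, the reason the threads in your snippet never end is that run()
> loops forever on a blocking queue.get(), and daemon threads only die
> when the process exits. If you wanted workers that finish on their own,
> the usual trick is one sentinel per worker, something like:
>
>     # sketch: after the real hosts, push one None per worker
>     for _ in range(20):
>         queue.put(None)
>
>     # and in run():
>     #     host = self.queue.get()
>     #     if host is None:          # sentinel: no more work
>     #         self.queue.task_done()
>     #         break
>
> but an external process is still the better fit here.)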
>
> On Sunday, July 29, 2012, at 23:28:48 UTC+2, Tito Garrido wrote:
>
>> Hi Folks!
>>
>> I'd like to create a web application that will basically send a command
>> to 1000 servers and grab some data using subprocess + ssh, like:
>>
>> import subprocess as sub
>>
>> p = sub.Popen(['ssh', '-q', 'support@%s' % hostname, command],
>>               stdout=sub.PIPE, stderr=sub.STDOUT)
>> out = p.stdout.read()
>> print out  # it will be a db insert
>> p.wait()   # wait() (rather than poll()) guarantees returncode is set
>> rc = p.returncode
>> print rc   # it will be a db insert
>>
>> I can't execute all the threads at the same time, so I would need to pool
>> them and execute using the Queue module.
>>
>> My first code was:
>>
>> import threading
>> import time
>> import Queue
>>
>> class ThreadSSH(threading.Thread):
>>     """Thread running the ssh command for hosts pulled off the queue."""
>>     def __init__(self, queue, name):
>>         threading.Thread.__init__(self)
>>         self.queue = queue
>>         self.name = name
>>
>>     def run(self):
>>         while True:
>>             # grab a host from the queue (blocks until one is available)
>>             print 'Empty: %s' % self.queue.empty()
>>             print 'Qsize: %s' % self.queue.qsize()
>>             print '%s' % dir(self.queue)
>>             host = self.queue.get()
>>
>>             # core of the thread (the real ssh call would go here)
>>             time.sleep(10)
>>             print 'Active count: %s' % threading.activeCount()
>>             print 'Exiting ' + self.name
>>
>>             # signal to the queue that this job is done
>>             self.queue.task_done()
>>
>> def test():
>>     hosts = ['host1', 'host2', 'host3', 'host4']
>>     # queue of servers; filled below
>>     queue = Queue.Queue()
>>     print 'begin: %s' % threading.enumerate()
>>
>>     # spawn a pool of threads and pass them the queue instance
>>     for i in range(20):
>>         t = ThreadSSH(queue, 'test-%s' % i)
>>         t.setDaemon(True)
>>         t.start()
>>     print 'threads generated: %s' % threading.enumerate()
>>
>>     # populate the queue with data
>>     for host in hosts:
>>         queue.put(host)
>>         print 'SIZE: %s' % queue.qsize()
>>
>>
>> Not sure if this is the right approach for this kind of application, but
>> the problem is that those threads never end... if I put this in another
>> function:
>> def threadInfo():
>>     output = 'enumerate: %s<br/>' % ['%s Live(%s) Daemon(%s)'
>>              % (t.name, t.isAlive(), t.isDaemon())
>>              for t in threading.enumerate()]
>>     output += 'active count: %s<br/>' % threading.activeCount()
>>     output += 'current thread: %s<br/>' % threading.currentThread().name
>>     return output
>>
>> I can see that even after a thread prints 'Exiting', it still shows up
>> in enumerate().
>>
>> Do you guys have another idea or know how to fix this issue?
>>
>> Thanks in advance,
>>
>> Tito
>>



-- 

Linux User #387870
.........____
.... _/_õ|__|
..º[ .-.___.-._| . . . .
.__( o)__( o).:_______
