Could anyone give a working example using asynchat in Python3.0?

2008-10-21 Thread davy zhang
I tried to use it, but the terminator option seems to have no effect.

I searched Google, but all the code I found is in Python 2.x form; I
modified it but had no luck.

Thanks for any advice
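As the replies below confirm, the usual cause in Python 3.0 is that socket data arrives as bytes, so a str terminator never matches. A minimal sketch of the mismatch, independent of asynchat (the buffer contents are invented for illustration):

```python
data = b"GET / HTTP/1.0\r\n"   # what collect_incoming_data sees in Python 3

# a bytes terminator is located as expected
assert data.find(b"\r\n") == 14

# a str terminator can never be located inside a bytes buffer
try:
    data.find("\r\n")
except TypeError as exc:
    print("str terminator fails:", exc)
```

So in Python 3 the call must be `self.set_terminator(b"\r\n")`, not `self.set_terminator("\r\n")`.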
--
http://mail.python.org/mailman/listinfo/python-list


why this server cannot run in python3.0

2008-10-21 Thread davy zhang
import asyncore, asynchat
import os, socket, string

PORT = 8000

class HTTPChannel(asynchat.async_chat):

    def __init__(self, server, sock, addr):
        asynchat.async_chat.__init__(self, sock)
        self.set_terminator("\r\n")
        self.request = None
        self.data = ""
        self.shutdown = 0

    def collect_incoming_data(self, data):
        self.data = self.data + data

    def found_terminator(self):
        if not self.request:
            # got the request line
            self.request = string.split(self.data, None, 2)
            if len(self.request) != 3:
                self.shutdown = 1
            else:
                self.push("HTTP/1.0 200 OK\r\n")
                self.push("Content-type: text/html\r\n")
                self.push("\r\n")
                self.data = self.data + "\r\n"
                self.set_terminator("\r\n\r\n") # look for end of headers
        else:
            # return payload.
            self.push("\r\n")
            self.push(self.data)
            self.push("\r\n")
            self.close_when_done()

class HTTPServer(asyncore.dispatcher):

    def __init__(self, port):
        asyncore.dispatcher.__init__(self)
        self.create_socket(socket.AF_INET, socket.SOCK_STREAM)
        self.bind(("", port))
        self.listen(5)

    def handle_accept(self):
        conn, addr = self.accept()
        HTTPChannel(self, conn, addr)

#
# try it out

s = HTTPServer(PORT)
print ("serving at port", PORT, "...")
asyncore.loop()
--
http://mail.python.org/mailman/listinfo/python-list


Re: why this server cannot run in python3.0

2008-10-22 Thread davy zhang
thanks so much for fixing the problems!!


On Wed, Oct 22, 2008 at 2:56 PM, Gabriel Genellina
<[EMAIL PROTECTED]> wrote:
> On Wed, 22 Oct 2008 03:45:31 -0200, davy zhang <[EMAIL PROTECTED]>
> wrote:
>
>> import asyncore, asynchat
>> import os, socket, string
>>
>> PORT = 8000
>>
>> class HTTPChannel(asynchat.async_chat):
>>
>>def __init__(self, server, sock, addr):
>>asynchat.async_chat.__init__(self, sock)
>>self.set_terminator("\r\n")
>
> self.set_terminator(b"\r\n")
>
>>self.request = None
>>self.data = ""
>
> self.data = b""
>
> Same for all remaining string literals, should be bytes instead.
>
>>self.request = string.split(self.data, None, 2)
>
> The string module functions were deprecated ages ago in favor of the
> corresponding string (instance) methods:
>
> self.request = self.data.split(None, 2)
>
> That's enough - the example worked fine for me after doing these changes.
>
> --
> Gabriel Genellina
>
> --
> http://mail.python.org/mailman/listinfo/python-list
>
--
http://mail.python.org/mailman/listinfo/python-list


new to python network programming: is async_chat.push thread-safe? python3.0

2008-10-23 Thread davy zhang
I wrote this server to handle incoming messages in a process (via
multiprocessing) named "handler", and to send messages in a thread named
"sender", because I think an async_chat object cannot be passed between
processes.

My project is a network gate server with many complex logic handlers
behind it, so I use multiprocessing to handle them separately and send
the results back to the clients later, when done.
To make the server use a multicore CPU I tried to separate the send and
receive functions into different processes, but it seems that cannot be
done :)

I just have some questions about this design:
1. Is async_chat.push thread-safe? I ask because I sometimes get random
errors reporting that the push fifo queue is out of index 0.
2. Is the whole design odd in any way?
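One way to guard against concurrent pushes is to serialize push() behind a lock, much like the pushData wrapper in a later post. A minimal sketch; FakeChannel is a made-up stand-in for async_chat, not part of asynchat:

```python
import threading

push_lock = threading.Lock()

class FakeChannel:
    """Made-up stand-in for asynchat.async_chat: it just keeps a fifo."""
    def __init__(self):
        self.fifo = []
    def push(self, data):
        self.fifo.append(data)

def safe_push(channel, data):
    # serialize push() calls so two threads never touch the fifo at once
    with push_lock:
        channel.push(data)

ch = FakeChannel()
threads = [threading.Thread(target=safe_push, args=(ch, b"msg%d" % i))
           for i in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(ch.fifo))  # 10 -- no pushes lost
```

Note the lock only protects the fifo itself; calling push() from outside the asyncore loop thread can still race with handle_write() in other ways.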


here is my code

import asyncore, asynchat
import os, socket, string
from multiprocessing import Process,Manager
import pickle
import _thread

PORT = 80

policyRequest = b""
policyReturn = b"""
 
  \x00"""

def handler(taskList,msgList):
    while 1:
        print('getting task')
        item = pickle.loads(taskList.get())
        print('item before handle ', item)
        item['msg'] += b' handled done'
        msgList.put(pickle.dumps(item))

def findClient(id):
    for item in clients:
        if item.idx == id:
            return item

def sender():
    global msgList
    while 1:
        item = pickle.loads(msgList.get())
        #print time()
        c = findClient(item['cid'])
        #print time()
        c.push(item['msg'])
        print('msg sent ',item['msg'])
        #print time()

class HTTPChannel(asynchat.async_chat):

    def __init__(self, server, sock, addr):
        global cid
        asynchat.async_chat.__init__(self, sock)
        self.set_terminator(b"\x00")
        self.data = b""
        cid += 1
        self.idx = cid
        if not self in clients:
            clients.append(self)

    def collect_incoming_data(self, data):
        self.data = self.data + data
        print(data)

    def found_terminator(self):
        global taskList
        print("found",self.data)
        if self.data == policyRequest:
            self.push(policyReturn)
        else:
            d = {'cid':self.idx,'msg':self.data}
            taskList.put(pickle.dumps(d))
            self.data = b""

    def handle_close(self):
        if self in clients:
            clients.remove(self)

class HTTPServer(asyncore.dispatcher):

    def __init__(self, port):
        asyncore.dispatcher.__init__(self)
        self.create_socket(socket.AF_INET, socket.SOCK_STREAM)
        self.bind(("", port))
        self.listen(5)

    def handle_accept(self):
        conn, addr = self.accept()
        HTTPChannel(self, conn, addr)


#
# try it out
if __name__ == "__main__":
    s = HTTPServer(PORT)
    print ("serving at port", PORT, "...")

    #clients sock obj list stored for further use
    clients=[]

    #client id auto increment
    cid = 0

    manager = Manager()
    taskList = manager.Queue()
    msgList = manager.Queue()


    h = Process(target=handler,args=(taskList,msgList))
    h.start()


    _thread.start_new_thread(sender,())
    print('entering loop')

    asyncore.loop()
--
http://mail.python.org/mailman/listinfo/python-list


Re: Will Python 3 be "stackless"?

2008-10-23 Thread davy zhang
multiprocessing is good enough for now,

On Fri, Oct 24, 2008 at 4:30 AM, Diez B. Roggisch <[EMAIL PROTECTED]> wrote:
> Phillip B Oldham schrieb:
>>
>> On Thu, Oct 23, 2008 at 9:20 PM, Chris Rebert <[EMAIL PROTECTED]> wrote:
>>>
>>> No, it will definitely not.
>>
>>> From your statement (and I'm terribly sorry if I've taken it out of
>>
>> context) it would seem that such features are frowned-upon. Is this
>> correct? And if so, why?
>
> You got the wrong impression. It's not frowned upon. It just is a lot of
> extra effort to implement & thus makes the development of "normal" features
> more complex.
>
> Diez
> --
> http://mail.python.org/mailman/listinfo/python-list
>
--
http://mail.python.org/mailman/listinfo/python-list


why asynchat's initiate_send() get called twice after reconnect ?

2008-10-25 Thread davy zhang
Python 3.0rc1, Windows XP

in the lib\asynchat.py

    def handle_write(self):
        self.initiate_send()

    def push(self, data):
        sabs = self.ac_out_buffer_size
        if len(data) > sabs:
            for i in range(0, len(data), sabs):
                self.producer_fifo.append(data[i:i+sabs])
        else:
            self.producer_fifo.append(data)
        self.initiate_send()

when there's only a one-time connection, the object works just fine, but
problems come up when the client disconnects and then reconnects to the
server. It seems there are two ways initiate_send can get called: one
from push(), which I call in my program, and one from handle_write(),
which is called automatically from asyncore.loop(). I just can't work
out why a one-time connection works fine but a multi-time connection
goes bad.

I printed the traceback. I found that when a one-time connection is
made, handle_write() always stays silent, but the second time it gets
called and starts to call initiate_send at the same time as push() gets
called. So confusing



So I tried removing the initiate_send call from push(), and the code
magically works fine for me.

The main program is listed below.
Since it needs a flash client, I attached a webpage to reproduce the
problem: clicking the connect button multiple times and then clicking
the send button will produce the error.

import asyncore, asynchat
import os, socket, string
from multiprocessing import Process,Manager
import pickle
import _thread
import threading

PORT = 80

policyRequest = b""
policyReturn = b"""

 \x00"""

def handler(taskList,msgList):
    while 1:
        print('getting task')
        item = pickle.loads(taskList.get())
        print('item before handle ', item)
        #do something
        item['msg'] += b' handled done'
        msgList.put(pickle.dumps(item))

def findClient(id):
    for item in clients:
        if item.idx == id:
            return item

def pushData(ch,data):
    global pushLock
    pushLock.acquire()
    try:
        ch.push(data)
    finally:
        pushLock.release()


def sender():
    global msgList
    print('thread started')
    while 1:
        item = pickle.loads(msgList.get())
        #print time()
        c = findClient(item['cid'])
        #print time()
        #wrong here it's not thread safe, need some wrapper
        #c.push(item['msg'])
        pushData(c,item['msg'])
        print('msg sent ',item['msg'])
        #print time()

class HTTPChannel(asynchat.async_chat):

    def __init__(self, server, sock, addr):
        global cid
        asynchat.async_chat.__init__(self, sock)
        self.set_terminator(b"\x00")
        self.data = b""
        cid += 1
        self.idx = cid
        if not self in clients:
            print('add to clients:',self)
            clients.append(self)

    def collect_incoming_data(self, data):
        self.data = self.data + data
        print(data)

    def found_terminator(self):
        global taskList
        print("found",self.data)
        if self.data == policyRequest:
            pushData(self,policyReturn)
            self.close_when_done()
        else:
            d = {'cid':self.idx,'msg':self.data}
            taskList.put(pickle.dumps(d))
            self.data = b""

    def handle_close(self):
        if self in clients:
            print('remove from clients:',self)
            clients.remove(self)

class HTTPServer(asyncore.dispatcher):

    def __init__(self, port):
        asyncore.dispatcher.__init__(self)
        self.create_socket(socket.AF_INET, socket.SOCK_STREAM)
        self.bind(("", port))
        self.listen(5)

    def handle_accept(self):
        conn, addr = self.accept()
        print('a new customer!')
        HTTPChannel(self, conn, addr)


#
# try it out
if __name__ == "__main__":
    s = HTTPServer(PORT)
    print ("serving at port", PORT, "...")

    #push data lock
    pushLock = threading.Lock()


    clients=[]

    cid = 0

    manager = Manager()

    taskList = manager.Queue()

    msgList = manager.Queue()


    h = Process(target=handler,args=(taskList,msgList))
    h.start()


    _thread.start_new_thread(sender,())
    print('entering loop')
    asyncore.loop()
--
http://mail.python.org/mailman/listinfo/python-list


how to use logging module to log an object like print()

2008-10-29 Thread davy zhang
import logging
import pickle


# create logger
logger = logging.getLogger("simple_example")
logger.setLevel(logging.DEBUG)
# create console handler and set level to debug
ch = logging.StreamHandler()
ch.setLevel(logging.DEBUG)
# create formatter
formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")
# add formatter to ch
ch.setFormatter(formatter)
# add ch to logger
logger.addHandler(ch)

d = {'key':'msg','key2':'msg2'}

# "application" code
logger.debug("debug message",d)#can not do this
logger.info("info message")
logger.warn("warn message")
logger.error("error message")
logger.critical("critical message")
--
http://mail.python.org/mailman/listinfo/python-list


Re: how to use logging module to log an object like print()

2008-10-29 Thread davy zhang
thanks so much, I'm gonna check the formatter str for more info :)
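A minimal sketch of Diez's %r suggestion (handler setup trimmed down to basicConfig for brevity): the logger %-formats extra positional arguments into the message lazily, so an object can be logged much like with print().

```python
import logging

logging.basicConfig(level=logging.DEBUG,
                    format="%(asctime)s - %(name)s - %(levelname)s - %(message)s")
logger = logging.getLogger("simple_example")

d = {'key': 'msg', 'key2': 'msg2'}
# extra positional args are merged into the message only if the record
# is actually emitted, so this is cheaper than pre-formatting the string
logger.debug("debug message %r", d)
```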

On Wed, Oct 29, 2008 at 5:10 PM, Diez B. Roggisch <[EMAIL PROTECTED]> wrote:
> davy zhang schrieb:
>>
>> import logging
>> import pickle
>>
>>
>> # create logger
>> logger = logging.getLogger("simple_example")
>> logger.setLevel(logging.DEBUG)
>> # create console handler and set level to debug
>> ch = logging.StreamHandler()
>> ch.setLevel(logging.DEBUG)
>> # create formatter
>> formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")
>> # add formatter to ch
>> ch.setFormatter(formatter)
>> # add ch to logger
>> logger.addHandler(ch)
>>
>> d = {'key':'msg','key2':'msg2'}
>>
>> # "application" code
>> logger.debug("debug message",d)#can not do this
>
> logger.debug("yes you can: %r", d)
>
>
> Diez
> --
> http://mail.python.org/mailman/listinfo/python-list
>
--
http://mail.python.org/mailman/listinfo/python-list


is there a way to access postgresql in python3.0rc1

2008-10-29 Thread davy zhang
I'm currently on a project that could last for at least 1 or 2 years,
so I chose Python 3 as the server-side programming language.
All I have found so far are Python 2.x-ready libraries. Is there any
library that is Python 3.0-ready, or at least in alpha, beta or
something? I don't need many features; just basic functions are OK.

Thanks for any hint~
--
http://mail.python.org/mailman/listinfo/python-list


Re: is there a way to access postgresql in python3.0rc1

2008-10-29 Thread davy zhang
thanks, I'll wait a month and see; in the meantime I can use 2.x for
my prototyping. Hope the Python 3.0 final can drop a nuke on the ground :D

On Thu, Oct 30, 2008 at 12:31 PM, Steve Holden <[EMAIL PROTECTED]> wrote:
> Terry Reedy wrote:
>> davy zhang wrote:
>>> I'm currently on a  project, it could last for at least 1 or 2 years.
>>> so I choose python3 as server side programing language.
>>> All I found on are python2.x ready libraries, Is there any library is
>>> python3.0 ready? or just under alpha ,beta or something, I don't much
>>> features, just basic functions are OK
>>
>> Python3.0 final should be released in about a month.  Extension
>> libraries will start appearing after that.  You will have to ask the
>> maintainers of a particular library what their plans are for Python 3.
>> Some will port very easily and could be available soon.  Others will
>> take more work and may not appear so soon.
>>
> Please note, however, that Python 3.x is not likely to be as
> well-supported by extension modules as Python 2.x for some considerable
> time. The OP may therefore wish to reconsider his choice of Python 3.0.
>
> regards
>  Steve
> --
> Steve Holden+1 571 484 6266   +1 800 494 3119
> Holden Web LLC  http://www.holdenweb.com/
>
> --
> http://mail.python.org/mailman/listinfo/python-list
>
--
http://mail.python.org/mailman/listinfo/python-list


Is there a way to step-debug a multiprocessing python program?

2008-11-07 Thread davy zhang
I mean, can every process be attached like a thread in Wing IDE,

like a thread or tasklet in Wing IDE?

:)

Maybe I asked too much :D
--
http://mail.python.org/mailman/listinfo/python-list


concurrency program design: stackless python tasklet or python thread?

2008-11-10 Thread davy zhang
First, here is my basic idea: every actor holds its own msg queue, and
its process function handles each message as soon as the dispatcher
object puts it in.

This idea naturally leads me to place every actor in a separate thread
waiting for msgs.

But rumor has it that stackless python, with tasklets and channels, can
do much better at concurrent programming, so I dived into it.

But I found that tasklets really run as a lined-up sequence; that means
if a tasklet blocks or does some time-consuming calculation, the other
tasklets cannot get a CPU slice.

So we must design very carefully to avoid a big job in a single task.

I am just confused why stackless python is said to be good for the
concurrency programming model, or did I just get the wrong idea of how
to practice it?
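The cooperative model in question can be sketched without Stackless at all, using plain generators: each yield plays the role of stackless.schedule(), and a tasklet that never yields would starve the others. All names below are illustrative, not Stackless API:

```python
from collections import deque

def worker(name, steps, out):
    # each yield hands control back to the scheduler, like stackless.schedule()
    for i in range(steps):
        out.append((name, i))
        yield

def run(tasklets):
    # round-robin scheduler: a tasklet that blocks or never yields
    # would keep all the others waiting
    queue = deque(tasklets)
    while queue:
        task = queue.popleft()
        try:
            next(task)
            queue.append(task)       # re-schedule after it yields
        except StopIteration:
            pass                     # tasklet finished, drop it

out = []
run([worker("A", 2, out), worker("B", 2, out)])
print(out)  # interleaved: [('A', 0), ('B', 0), ('A', 1), ('B', 1)]
```

This is exactly why the design caution above matters: fairness comes entirely from how often each task yields.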
--
http://mail.python.org/mailman/listinfo/python-list


Re: concurrency program design: stackless python tasklet or python thread?

2008-11-10 Thread davy zhang
thanks very much for the hint; circuits is a very good event-driven
framework, just like twisted

but currently my project is pretty complex

see, I'm designing a so-called "Game Server"; every client has its own
task execution order, like below:

1. clientA wants to sell his armor
2. clientA wants to buy the knife
3. clientA wants to talk to the npc

I think for one client this sequence should be lined up; the tasks
should follow steps 1, 2, 3

but there are multiple clients on this server, so the other clients
cannot wait until clientA finishes his job. Say there's a clientB like
below:

1. clientB wants to say something to the npc
2. clientB wants to attack clientC
3. clientB wants to draw his sword

the whole sequence for the server should then look like this:

1. clientA wants to sell his armor
2. clientB wants to say something to the npc
3. clientB wants to attack clientC
4. clientA wants to buy the knife
5. clientA wants to talk to the npc
6. clientB wants to draw his sword

for clientA and clientB separately their tasks are lined up; for the
whole system they are concurrent

plus, I don't want to block the whole task system when a single client
is dealing with a big chunk of work.

I don't know if I've got the picture right :)

any ideas about my design? thanks a lot
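The ordering described above can be sketched with one FIFO queue per client and a round-robin dispatcher; the task names come from the example, while the scheduler itself is only illustrative:

```python
from collections import deque

# per-client FIFO queues: tasks for one client run in order,
# while the dispatcher interleaves the clients round-robin
queues = {
    "clientA": deque(["sell armor", "buy knife", "talk to npc"]),
    "clientB": deque(["talk to npc", "attack clientC", "draw sword"]),
}

order = []
while any(queues.values()):
    for client, q in queues.items():
        if q:
            # take exactly one task per client per round, so no client
            # monopolizes the system and per-client order is preserved
            order.append((client, q.popleft()))

print(order)
```

Each client's subsequence stays in 1-2-3 order, while the global sequence interleaves the two clients, which matches the mixed sequence above.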
--
http://mail.python.org/mailman/listinfo/python-list


Re: concurrency program design: stackless python tasklet or python thread?

2008-11-12 Thread davy zhang
thank you very much for the advice!

I asked myself many times, why not just use threads :D

After some research I found threads have some fatal defects:

1. the number of threads is limited by the OS, which means the system
doesn't want you to start many threads at the same time
2. a thread pool is another approach to concurrent programming, but
the context switching can be very costly

so here comes the stackless way?


On Wed, Nov 12, 2008 at 12:10 AM, Aleksandar Radulovic <[EMAIL PROTECTED]> 
wrote:
> Hi there,
>
> On Tue, Nov 11, 2008 at 5:57 AM, davy zhang <[EMAIL PROTECTED]> wrote:
>> first here is my basic idea is every actor holds their own msg queue,
>> the process function will handle the message as soon as the dispatcher
>> object put the message in.
>
> Using stackless, every tasklet can have a channel which it uses to communicate
> with other tasklets. The tasklet is blocked until there's something on
> the channel
> to receive.
>
>> This idea naturally leads me to place every actor in a separate thread
>> waiting for msg
>
> You can have actors with many separate tasklets waiting for messages, still
> being relatively lightweight, meaning you can run thousands of tasklets 
> without
> serious lack of performance.
>
>> but I found the tasklet is really a lined-up sequence , that means if
>> a tasklet blocked or do some time consuming calculation, the other
>> tasklets can not get the cpu slice
>
> This is cooperative scheduling, which you can choose not to use with Stackless
> (instead, use preemptive scheduling). If you determine that one particular
> task is taking too much cpu, you can declaratively call stackless.schedule()
> and put that task back to the scheduler queue and allow other tasks to
> have a go.
>
>> so we must design very carefully to avoid the big job for single task
>
> That's right - with cooperative scheduling careful design is the key.
>
>> I am just confused why the stackless python is said to be good at
>> concurrency program model or just I get a wrong idea to practice?
>
> Stackless is an extremely lightweight way into concurrent programming.
> I have personally used it in few projects and i quite like how lightweight
> it is and how easy it is to write concurrent programs.
>
> On the plus side, Stackless developers have plenty of examples and
> common idioms of how Stackless should be used, which I highly recommend
> you to check out. You might find a solution to your problem right there
> amongst the examples.
>
>
> Check it all out on http://www.stackless.com
>
> --
> a lex 13 x
> http://www.a13x.info
> --
> http://mail.python.org/mailman/listinfo/python-list
>
--
http://mail.python.org/mailman/listinfo/python-list