I wrote a little script that acts as a proxy: you give it a URL, and it fetches 
the content and displays it back to you.

For some reason, this proxy sometimes blocks and refuses to serve any new 
requests. The script keeps running, but it seems to be stuck somewhere.

When I strace it to see what it's doing, I find it hanging on this system call:
root@backup[10.10.10.21] ~/SCRIPTS/INFOMANIAK # strace -fp 6918
Process 6918 attached - interrupt to quit
recvfrom(6,
^CProcess 6918 detached
root@backup[10.10.10.21] ~/SCRIPTS/INFOMANIAK # 

I read in the SimpleHTTPServer source code that one can inherit from the 
SocketServer.ThreadingMixIn mixin to get a threaded server that handles 
multiple requests at a time instead of just one (thinking maybe that was what 
was blocking it). However, it seems to have nothing to do with my problem. 
What I need is not only to handle multiple requests at a time, but more 
importantly to make the request handler non-blocking.
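
If I'm reading the docs right, the mixin is meant to be combined with the 
server class rather than the handler, something like this rough sketch (the 
ThreadedHTTPServer name and the daemon_threads setting are just my guesses at 
the intended usage, and I'm not sure it addresses my issue):

import BaseHTTPServer
import SocketServer

# Sketch: mix ThreadingMixIn into the *server* class so each incoming
# request is handled in its own thread instead of serially.
class ThreadedHTTPServer(SocketServer.ThreadingMixIn, BaseHTTPServer.HTTPServer):
    daemon_threads = True  # handler threads won't keep the process alive

# Then, with the same Handler, IP and PORT as in my code below:
# server = ThreadedHTTPServer((IP, PORT), Handler)
# server.serve_forever()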

Any ideas? Here's some code:

import SimpleHTTPServer
import BaseHTTPServer
import SocketServer
import requests

class Handler(SocketServer.ThreadingMixIn, SimpleHTTPServer.SimpleHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header('Content-Type', 'text/html')
        self.end_headers()
        # self.path will contain a URL to be fetched by my proxy
        self.wfile.write(getFlux(self.path.lstrip("/")))

session = requests.Session()
IP,PORT = "MY_IP_HERE",8080

def getFlux(url):
    response = session.get(url)
    return response.text

server = BaseHTTPServer.HTTPServer((IP,PORT),Handler)
server.serve_forever()
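
One thing I'm wondering about (not sure if it's the right fix): since strace 
shows the process stuck in recvfrom, maybe it's the session.get() call itself 
that blocks forever when the remote server stops answering. In that case I 
suppose a timeout on the requests call would at least unstick the handler, 
something like this (the 10-second value is arbitrary):

def getFlux(url):
    try:
        # timeout makes requests raise an exception instead of waiting forever
        response = session.get(url, timeout=10)
        return response.text
    except requests.exceptions.RequestException as e:
        return "Error fetching %s: %s" % (url, e)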

Thank you.