Andres Riancho added the comment:
FYI, I'm using Python 2.7.6
Changes by Andres Riancho:
type: -> behavior
Changes by Andres Riancho:
versions: +Python 2.7
Andres Riancho added the comment:
Django's ticket [0] shows the ugly code people write to work around this Python bug.
[0] https://code.djangoproject.com/ticket/15863
Andres Riancho added the comment:
Well, closing this as won't-fix is far from ideal. More than four years have passed since the last activity here, but people are still being hit by this issue.
In my case I'm not creating any special subclass; I just use one of Python's built-in libs:
Andres Riancho added the comment:
Is this a duplicate of #10015 (http://bugs.python.org/issue10015)?
nosy: +Andres.Riancho
New submission from Andres Riancho:
In pool.py, the worker function reads as follows:
http://svn.python.org/view/python/trunk/Lib/multiprocessing/pool.py?view=markup
"""
68 job, i, func, args, kwds = task
69 try:
70 result = (True, fun
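To see how that code path behaves from the caller's side, here is a minimal sketch (the failing function and its error are made up for illustration, not taken from the report):
"""
# Minimal sketch: the worker wraps the call in try/except, so an exception
# raised by the target function travels back as a (False, exception) result
# and is re-raised in the parent when .get() is called.
from multiprocessing import Pool

def fail(x):
    raise ValueError("boom: %r" % (x,))   # illustrative error only

if __name__ == '__main__':
    pool = Pool(processes=2)
    result = pool.apply_async(fail, (42,))
    try:
        result.get()
    except ValueError, e:
        print "parent saw:", e
    pool.close()
    pool.join()
"""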
Andres Riancho added the comment:
Yes, the traceback was in my code because, as I stated before, "my w3af code had a section of urllib2's code in logHandler.py"; in other words, I copy+pasted a section of urllib2 into my code.
Can't provide a
Andres Riancho added the comment:
One more comment to be added. Please take a look at the following w3af bug report [0]. The interesting part starts at "[ Sun Nov 28 01:25:47 2010 - debug ] Traceback (most recent call last):".
In there you'll find that my w3af code had a section of urllib2's code in it.
Andres Riancho added the comment:
Please take a deeper look. I think you're trusting the "old code" more than my bug report. Some things to keep in mind:
* The "headers" parameter is a dict. It will never have a getheaders method.
* If you search the whole u
New submission from Andres Riancho:
Buggy Code:
"""
def http_error_302(self, req, fp, code, msg, headers):
    # Some servers (incorrectly) return multiple Location headers
    # (so probably same goes for URI). Use first header.
    if 'location' in headers:
        newurl = headers.getheaders('location')[0]
    elif 'uri' in headers:
        newurl = headers.getheaders('uri')[0]
    else:
        return
"""
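As a quick illustration of the failure mode described in the comment above (the header value here is just a placeholder), note that a plain dict has no getheaders method:
"""
# Illustration only: if headers is a plain dict, the code above raises as
# soon as it sees a Location header.
headers = {'location': 'http://example.com/redirected'}   # placeholder value

if 'location' in headers:
    newurl = headers.getheaders('location')[0]   # AttributeError: 'dict' object has no attribute 'getheaders'
"""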
Andres Riancho added the comment:
The problem is still there in 2.7:
>>> urlparts = urlparse.urlparse('C:\\boot.ini')
>>> urlparts
('c', '', '\\boot.ini', '', '', '')
>>> if not urlparts.path:
..
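For clarity, a short sketch of the same misparse (the file name is the one from the session above): the drive letter is swallowed as the URL scheme, so any check based on the scheme or the path goes wrong.
"""
# Sketch of the misparse: "C:" is taken as a URL scheme, so the path loses
# its drive letter and scheme-based "is this a local file?" checks break.
import urlparse

parts = urlparse.urlparse('C:\\boot.ini')   # a Windows path, not a URL
print parts.scheme    # prints: c
print parts.path      # prints: \boot.ini  (drive letter gone)
"""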
New submission from Andres Riancho:
Buggy code:
"""
if 'location' in headers:
    newurl = headers.getheaders('location')[0]
elif 'uri' in headers:
    newurl = headers.getheaders('uri')[0]
else:
    return
"""
Andres Riancho added the comment:
- Problem: the secure flag of cookies is ignored by the load method.
- Why is it related to this issue? Because the secure flag is a name without a value:
  pie=good; other=thing; secure
- Why is it bad? Because the RFC says that we
Andres Riancho added the comment:
My problem, and the problem of the original bug reporter (sirilyan), is that the load method ignores names that don't have values. Quoting the original bug report:
>>> import Cookie
>>> q = Cookie.SimpleCookie(
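A minimal sketch of the behaviour being described, using the cookie string from the comment above (on the affected Python 2 versions):
"""
# Sketch of the report: "secure" carries no "=value", and on the affected
# versions SimpleCookie.load() silently drops it instead of recording it.
import Cookie

c = Cookie.SimpleCookie()
c.load("pie=good; other=thing; secure")
print sorted(c.keys())                  # the bare "secure" attribute is gone
print c['pie'].value, c['other'].value  # good thing
"""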
Andres Riancho added the comment:
The RFC I'm talking about is: http://www.ietf.org/rfc/rfc2109.txt
Andres Riancho added the comment:
Sorry to bother you guys after so much time, but I think that there is at least one bit of the RFC that isn't respected by this "name=value" thing... If we look at the RFC (2109) we'll see this:
cookie-av = "Comment" "=" value
          | "Domain" "=" value
          | "Max-Age" "=" value
          | "Path" "=" value
          | "Secure"
          | "Version" "=" 1*DIGIT
Andres Riancho added the comment:
As I said in my original bug report, if you don't remove the
content-length header or add the data, you are sending an invalid request:
START Request=
GET http://f00/1.php HTTP/1.1
Content-length: 63
Accept-encoding: identity
Accept: */*
User-
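As a hedged sketch of the kind of change being asked for (not the shipped urllib2 code): when the redirect drops the POST body, the body-describing headers should be dropped too before building the new Request.
"""
# Sketch only: strip the body-describing headers when the redirected request
# no longer carries the body, so the follow-up GET does not advertise a
# Content-length for data it never sends.
import urllib2

def build_redirected_request(req, newurl):
    newheaders = dict((k, v) for k, v in req.headers.items()
                      if k.lower() not in ("content-length", "content-type"))
    return urllib2.Request(newurl,
                           headers=newheaders,
                           origin_req_host=req.get_origin_req_host(),
                           unverifiable=True)
"""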
Andres Riancho added the comment:
According to the RFC:
If urllib2 gets a 302 in response to a request, it MUST send the *same*
request to the URI specified in the Location header, without modifying
the method, headers, or any data (urllib2 is not RFC compliant here)
In urllib2, a 301 and a
Andres Riancho added the comment:
As mentioned in the RFC, and quoted by orsenthil, "however, most
existing user agent implementations treat 302 as if it were a 303
response", which is true for urllib2.py too ( see line 585 ):
http_error_301 = http_error_303 = http
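One hedged way to observe that in practice is to hook the redirect handler and log the method of the follow-up request (the URL below is a placeholder, and the server is assumed to answer the POST with a 302):
"""
# Sketch: because the redirected Request is built without the original data,
# get_method() reports GET even though the first request was a POST.
import urllib2

class LoggingRedirectHandler(urllib2.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        new_req = urllib2.HTTPRedirectHandler.redirect_request(
            self, req, fp, code, msg, headers, newurl)
        print "redirect %d: %s -> %s" % (code, req.get_method(), new_req.get_method())
        return new_req

opener = urllib2.build_opener(LoggingRedirectHandler)
# opener.open("http://example.com/redirecting-endpoint", data="a=1")   # placeholder URL
"""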
Changes by Andres Riancho:
title: urllib 302 POST -> urllib2 302 POST
New submission from Andres Riancho:
There is an error in urllib2 when doing a POST request to a URI that
responds with a 302 redirection. The problem is in urllib2.py:536, where
the HTTPRedirectHandler creates the new Request based on the original one:
newurl = newurl.replace(' ', '%20')
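For context, here is a hedged reconstruction of the pattern being described (treat it as a sketch, not an exact quote of that code): the new Request reuses the original headers but not the original data, so a former POST goes out as a GET that still carries a Content-length header.
"""
# Sketch of the problematic pattern inside HTTPRedirectHandler.redirect_request:
# req.headers (including Content-length from the POST) is reused, but the new
# Request is built without req.data, so the follow-up GET claims a body it
# never sends.
import urllib2

def redirect_request_sketch(req, newurl):
    newurl = newurl.replace(' ', '%20')
    return urllib2.Request(newurl,
                           headers=req.headers,   # Content-length survives here
                           origin_req_host=req.get_origin_req_host(),
                           unverifiable=True)     # note: no data= argument
"""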
Andres Riancho added the comment:
I think this should be reopened. The findall call has been running for 3 hours now; I think it's a clear case of an infinite loop.
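The pattern itself isn't shown in these excerpts, but as an illustration of how a findall can run for hours without being a literal infinite loop, here is a classic catastrophic-backtracking sketch (pattern and input are made up, not taken from the report):
"""
# Illustration only: nested quantifiers make the regex engine backtrack
# exponentially when the overall match fails, so findall appears to hang.
import re

evil = re.compile(r'(a+)+b')
text = 'a' * 30 + 'c'       # no trailing 'b', so every split of the a's is tried
print evil.findall(text)    # on the order of 2**30 backtracking steps before returning []
"""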
Andres Riancho added the comment:
Have you tested it? Is the re.findall() finishing its work? I left it working for 5 minutes or more and got no response.
Cheers,