[issue37977] Big red pickle security warning should stress the point even more

2019-08-29 Thread Daniel Pope


New submission from Daniel Pope:

CVEs related to unpickling untrusted data continue to come up a few times a 
year:

https://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=pickle

This is certainly the tip of the iceberg. In a previous role I noted several 
internal services that could be compromised with maliciously crafted pickles. 
In my current role I can already see two internal services that look 
vulnerable. And in both organisations, little attention was paid to pickle data 
exchanged with other users over network filesystems, which may allow privilege 
escalation.

Chatting with Alex Willmer after his EuroPython 2018 talk 
(https://github.com/moreati/pickle-fuzz/blob/master/Rehabilitating%20Pickle.pdf), 
we agreed that the red warning in the docs is still not prominent enough, even 
after it was moved to the top of the page in https://bugs.python.org/issue9105.

The warning currently says:

"Warning: The pickle module is not secure against erroneous or maliciously 
constructed data. Never unpickle data received from an untrusted or 
unauthenticated source."

I would suggest several improvements:

* Use simpler, more direct English.
* Explain the severity of the vulnerability that unpickling untrusted data 
creates.
* Link to the hmac module, which can be used to detect tampering (see the 
sketch below).
* Link to the json module, which is safer if less powerful.
* Simply making the red box bigger (by adding more text) will in itself 
increase the prominence of the warning.
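
Here is that sketch: a minimal example of signing pickles so that tampering is 
detected before loading. The key handling and helper names are purely 
illustrative and are not taken from the docs:

import hashlib
import hmac
import pickle

SECRET_KEY = b"example-shared-secret"  # illustrative only; manage real keys securely

def dump_signed(obj) -> bytes:
    """Serialise obj and prepend an HMAC-SHA256 digest of the payload."""
    payload = pickle.dumps(obj)
    digest = hmac.new(SECRET_KEY, payload, hashlib.sha256).digest()
    return digest + payload

def load_signed(data: bytes):
    """Refuse to unpickle data whose digest does not verify."""
    digest, payload = data[:32], data[32:]
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(digest, expected):
        raise ValueError("pickle payload failed HMAC verification")
    return pickle.loads(payload)

For data that does not need pickle's power, json.dumps()/json.loads() avoids 
the arbitrary-code-execution risk entirely.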

--
assignee: docs@python
components: Documentation
messages: 350777
nosy: docs@python, lordmauve
priority: normal
severity: normal
status: open
title: Big red pickle security warning should stress the point even more
type: security
versions: Python 3.8, Python 3.9

Python tracker: <https://bugs.python.org/issue37977>



[issue37977] Big red pickle security warning should stress the point even more

2019-08-29 Thread Daniel Pope


Change by Daniel Pope:


--
keywords: +patch
pull_requests: +15271
stage:  -> patch review
pull_request: https://github.com/python/cpython/pull/15595

Python tracker: <https://bugs.python.org/issue37977>



[issue44963] anext_awaitable is not a collections.abc.Generator

2021-08-20 Thread Daniel Pope


New submission from Daniel Pope:

The anext_awaitable object returned by anext(..., default) does not support 
.send()/.throw(). It only supports __next__().

So we can pass messages from the suspending coroutine to the event loop but not 
from the event loop to the suspending coroutine.

trio and curio rely on both directions working. (I don't know about asyncio.)

For example, this trio code fails:

import trio

async def produce():
    for v in range(3):
        await trio.sleep(1)
        yield v

async def consume():
    p = produce()
    while True:
        print(await anext(p, 'finished'))

trio.run(consume)

raising AttributeError: 'anext_awaitable' object has no attribute 'send'.

I realise that any awaitable that wants to await another awaitable must return 
from __await__() not a plain iterator but something that implements the full 
PEP-342 generator protocol. Should the PEP-492 section on __await__() [1] say 
something about that?

[1] https://www.python.org/dev/peps/pep-0492/#await-expression
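
For illustration only (this is not the CPython implementation and the class 
name is made up), a rough sketch of an awaitable wrapper that does implement 
the full generator protocol by delegating to the iterator returned by the 
wrapped awaitable's __await__(), assuming that iterator supports 
send()/throw() as coroutine objects do:

class DelegatingAwaitable:
    def __init__(self, awaitable):
        # For a coroutine, __await__() returns an object with send()/throw().
        self._it = awaitable.__await__()

    def __await__(self):
        return self

    def __iter__(self):
        return self

    def __next__(self):
        return self._it.send(None)

    def send(self, value):
        # Forward values injected by the event loop (e.g. trio or curio).
        return self._it.send(value)

    def throw(self, typ, val=None, tb=None):
        # Forward exceptions (e.g. trio.Cancelled) into the wrapped awaitable.
        return self._it.throw(typ, val, tb)

    def close(self):
        self._it.close()

anext_awaitable would additionally need to translate StopAsyncIteration into 
returning the supplied default.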

--
components: Library (Lib)
messages: 399982
nosy: lordmauve
priority: normal
severity: normal
status: open
title: anext_awaitable is not a collections.abc.Generator
type: behavior
versions: Python 3.10, Python 3.11

Python tracker: <https://bugs.python.org/issue44963>



[issue34137] Add Path.lexist() to pathlib

2018-07-17 Thread Daniel Pope


New submission from Daniel Pope:

When using pathlib to manipulate paths that may be symlinks or regular files, a 
pattern that comes up frequently is this expression:

path.is_symlink() or path.exists()

os.path.lexists(path) can be used for this, but falling back to os.path when 
you are already working with pathlib feels like defeat.
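
For what it's worth, a minimal sketch of what the proposed method could do, 
written here as a stand-alone helper (the name and placement are only 
illustrative):

from pathlib import Path

def lexists(path: Path) -> bool:
    # Like os.path.lexists(): report whether the path exists without
    # following a trailing symlink, so a dangling symlink counts as existing.
    try:
        path.lstat()
    except OSError:
        return False
    return True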

--
components: Library (Lib)
messages: 321812
nosy: lordmauve
priority: normal
severity: normal
status: open
title: Add Path.lexist() to pathlib
type: enhancement
versions: Python 3.7

Python tracker: <https://bugs.python.org/issue34137>



[issue33123] Path.unlink should have a missing_ok parameter

2018-07-17 Thread Daniel Pope


Change by Daniel Pope:


--
nosy: +lordmauve

Python tracker: <https://bugs.python.org/issue33123>



[issue33123] Path.unlink should have a missing_ok parameter

2018-07-17 Thread Daniel Pope


Daniel Pope added the comment:

This would be a shortcut for the common case where you simply want an 
idempotent "make sure this file/symlink is gone" operation.

There are already boolean options that enable idempotent behaviour on several 
pathlib methods, such as mkdir(exist_ok=True) and touch(exist_ok=True), and 
write_bytes() and write_text() are idempotent by nature. A missing_ok 
parameter on unlink() aligns well with this.

Because this operation doesn't exist, developers are tempted to write

if path.exists():
    path.unlink()

which has a TOCTTOU race and also fails to handle dangling symlinks correctly, 
because exists() follows the link.
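
The race-free equivalent today is to catch the error, which is exactly the 
behaviour a missing_ok parameter would wrap up (the helper name is 
illustrative):

from pathlib import Path

def remove_if_present(path: Path) -> None:
    # Idempotent "make sure this file/symlink is gone": no exists()/unlink()
    # race, and dangling symlinks are removed too.
    try:
        path.unlink()
    except FileNotFoundError:
        pass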

--

Python tracker: <https://bugs.python.org/issue33123>



[issue34662] tarfile.TarFile may write corrupt files if not closed

2018-09-13 Thread Daniel Pope


New submission from Daniel Pope:

A tarfile.TarFile object open for writing may silently write corrupt tar files 
if it is destroyed before being closed.

While explicitly calling close() or using the object as a context manager is 
recommended, silent corruption is not something I would expect from such basic 
usage.

There are two steps needed for a TarFile to be closed properly:

* According to https://github.com/python/cpython/blob/3.7/Lib/tarfile.py#L1726, 
two zero blocks must be written (though GNU tar seems to work even if these are 
absent)
* The underlying fileobj (an io.BufferedWriter) must then be flushed

A BufferedWriter is flushed in its __del__(); the problem is that TarFile 
objects form a reference cycle with their TarInfo members due to this line, 
which has the comment "Not Needed": 
https://github.com/python/cpython/blob/3.7/Lib/tarfile.py#L1801

Under PEP-442, when the TarFile becomes unreferenced the following Cycle 
Isolate is formed:

TarInfo <=> TarFile -> BufferedWriter -> FileIO

Finalisers for these objects are run in an undefined order. If the FileIO 
finaliser is run before the BufferedWriter finaliser, then the fd is closed, 
buffered data in the BufferedWriter is not committed to disk, and the tar file 
is corrupt.

Additionally, while ResourceWarning is issued if the BufferedWriter or FileIO 
are left unclosed, no such warning is emitted by the TarFile.
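
To make the failure mode concrete, the safe pattern next to the risky one 
(file names are placeholders):

import tarfile

# Safe: close() runs deterministically via the context manager, writing the
# two terminating zero blocks and flushing the BufferedWriter before the
# file descriptor is closed.
with tarfile.open("archive.tar", "w") as tf:
    tf.add("data.txt")

# Risky: without an explicit close(), teardown happens only when the
# TarInfo <=> TarFile cycle is collected, and if the FileIO finaliser runs
# before the BufferedWriter's the archive is left truncated and corrupt.
tf = tarfile.open("archive.tar", "w")
tf.add("data.txt")
del tf  # no close(): end-of-archive blocks and buffered data may be lost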

--
components: Library (Lib)
messages: 325266
nosy: lordmauve
priority: normal
severity: normal
status: open
title: tarfile.TarFile may write corrupt files if not closed
type: behavior
versions: Python 3.7

Python tracker: <https://bugs.python.org/issue34662>



[issue34662] tarfile.TarFile may write corrupt files if not closed

2018-09-13 Thread Daniel Pope


Daniel Pope added the comment:

I have several suggestions for steps to address this:

1. Don't create reference cycles. TarInfo.tarfile does not appear to be a 
documented attribute 
(https://docs.python.org/3/library/tarfile.html#tarinfo-objects) and could 
perhaps be deleted.
2. Issue a ResourceWarning in TarFile.__del__() if the TarFile was not closed 
prior to finalisation. ResourceWarnings are ignored by default, but this would 
help when debugging. Given that the file may be corrupted, perhaps something 
more visible than a ResourceWarning is required.
3. Make TarFile.__del__() close the TarFile cleanly. This is only possible if 
we can guarantee the underlying fileobj is finalized later (e.g. because we 
have eliminated the reference cycle). A rough sketch of 2 and 3 combined is 
below.
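
(Not the real tarfile code: the subclass is only for illustration, and it 
relies on TarFile's internal closed flag.)

import tarfile
import warnings

class WarningTarFile(tarfile.TarFile):
    def __del__(self):
        if not getattr(self, "closed", True):
            # Suggestion 2: make the leak visible when warnings are enabled.
            warnings.warn(f"unclosed TarFile {self!r}", ResourceWarning,
                          source=self)
            # Suggestion 3: attempt a clean close so the archive is valid.
            try:
                self.close()
            except Exception:
                # The underlying file object may already have been finalised.
                pass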

--

Python tracker: <https://bugs.python.org/issue34662>



[issue34272] Reorganize C API tests

2018-10-24 Thread Daniel Pope


Change by Daniel Pope:


--
pull_requests: +9414

Python tracker: <https://bugs.python.org/issue34272>



[issue24718] Specify interpreter when running in IDLE

2015-07-25 Thread Daniel Pope

New submission from Daniel Pope:

I maintain a library called Pygame Zero, which allows beginner programmers to 
start writing games without lines of boilerplate for importing libraries, 
creating event loops, and so on.

To support these features, Pygame Zero provides the 'pgzrun' command:

pgzrun