On Oct 2, 11:27 am, "Aaron \"Castironpi\" Brady" <[EMAIL PROTECTED]> wrote:
> On Oct 1, 2:50 pm, est <[EMAIL PROTECTED]> wrote:
>
> > >>> import md5
> > >>> a=md5.md5()
> > >>> import pickle
> > >>> pickle.dumps(a)
> >
> > Traceback (most recent call last):
> >   File "<stdin>", line 1, in <module>
> >   File "C:\Python25\lib\pickle.py", line 1366, in dumps
> >     Pickler(file, protocol).dump(obj)
> >   File "C:\Python25\lib\pickle.py", line 224, in dump
> >     self.save(obj)
> >   File "C:\Python25\lib\pickle.py", line 306, in save
> >     rv = reduce(self.proto)
> >   File "C:\Python25\lib\copy_reg.py", line 69, in _reduce_ex
> >     raise TypeError, "can't pickle %s objects" % base.__name__
> > TypeError: can't pickle HASH objects
> >
> > Why can't I pickle an md5 object? Is it because the md5 algorithm needs
> > to read 512 bits at a time?
> >
> > I need to md5() some stream, pause (python.exe quits), and resume
> > later. It seems that the md5 and hashlib objects in the standard library
> > cannot be serialized?
> >
> > Do I have to implement the md5 algorithm again for this special occasion?
> >
> > Or is there any way to assign a digest when creating md5 objects?
>
> Can you just pickle the stream, the part of it you've read so far?
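For what it's worth, switching to hashlib doesn't help either: both modules wrap the same C-level HASH type, which (as far as I can tell) provides no pickle support at all, so you hit the same TypeError. A minimal reproduction in Python 2.5 style:

    # Reproduction sketch (Python 2.5): hashlib raises the same TypeError as
    # the old md5 module, since both wrap the same C HASH type, which (as far
    # as I can tell) defines no pickle support.
    import hashlib
    import pickle

    h = hashlib.md5()
    h.update("some streamed data")

    try:
        pickle.dumps(h)
    except TypeError, e:
        print "pickle failed:", e          # -> can't pickle HASH objects

    # The object does support copy() for forking a running digest,
    # but that only helps within a single live process.
    h2 = h.copy()
    print h.hexdigest() == h2.hexdigest()  # True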
Wow. It's a giga-size file. I need to stream-read it and md5 it as I go, and the process may break off for a while and then resume.
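The loop I'm picturing is roughly the sketch below. It assumes some pure-Python MD5 class whose instances *do* pickle cleanly (the `pymd5` module named here is hypothetical, not part of the standard library); with the stdlib md5/hashlib objects this exact approach fails, which is the whole problem.

    # Sketch of a resumable streaming digest. Assumes a hypothetical
    # pure-Python MD5 implementation "pymd5" whose objects survive pickling;
    # the stdlib md5/hashlib objects do not, as shown above.
    import os
    import pickle

    CHUNK = 1024 * 1024            # read 1 MB at a time
    STATE = "md5_progress.pickle"  # checkpoint file (name is arbitrary)

    def resumable_md5(path):
        import pymd5               # hypothetical picklable MD5 class
        if os.path.exists(STATE):
            # Resume: restore the digest object and the offset already hashed.
            f = open(STATE, "rb")
            digest, offset = pickle.load(f)
            f.close()
        else:
            digest, offset = pymd5.md5(), 0

        src = open(path, "rb")
        src.seek(offset)
        while True:
            block = src.read(CHUNK)
            if not block:
                break
            digest.update(block)
            offset += len(block)
            # Checkpoint each chunk, so a quit or crash loses at most one chunk.
            f = open(STATE, "wb")
            pickle.dump((digest, offset), f)
            f.close()
        src.close()

        os.remove(STATE)           # done; clear the checkpoint
        return digest.hexdigest()

Checkpointing after every chunk is deliberately paranoid; writing the state every N chunks instead would cut the pickling overhead at the cost of a correspondingly larger window of lost work.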