On Oct 17, 7:02 am, Michele <[EMAIL PROTECTED]> wrote:
> Hi,
> I write a simple encoder in python and Java; they do the same
> computations, with the same inputs
No they don't.

> however they won't produce the same
> output.
> Let me explain with code.

You have a strange understanding of the word "explain".

> First of all, you need a test file for input:
> $ dd if=/dev/urandom of=test.img bs=1048576 count=1
>
> I have attached the code.
> As you can see we have xor-red the same inputs (bitlists are the same,
> thus the selected blocks to xor are the same - you can easily see it,
> whenever a block is xor-red both programs will print out its hash).
> But here comes the strange: the random_block that the
> create_random_block function returns is not the same: in Java this block
> has an hash which is different from the Python one.
> Why?
>
> Thank you
>
> [test.py, 1K]
> random_block = ['0']*blocksize

This initialises each element to '0'. ord('0') is 48, not 0. Either I'm
hallucinating, or I pointed this out to you in response to your previous
posting (within the last few days).

> random_block[j] = chr(ord(random_block[j]) ^
>                       ord(block[j]))

> [Test.java, 2K]
> byte[] random_block = new byte[16384];

Java is guaranteed to initialise each element of a new byte[] to 0; it is
certainly not 48! So the two buffers differ before the first XOR is even
performed.

> random_block[j] = (byte)
>     (random_block[j] ^ block[j]);

--
http://mail.python.org/mailman/listinfo/python-list
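To make the point concrete, here is a minimal sketch (not Michele's attached code, and using a tiny blocksize of 4 instead of 16384) showing how the two initialisations diverge on the very first XOR pass:

```python
# Minimal illustration of the initialisation mismatch (not the attached code).
# Python's ['0'] * n starts every element at '0' (ord 48), while Java's
# `new byte[n]` starts every element at 0.

blocksize = 4
block = bytes([0x12, 0x34, 0x56, 0x78])  # stand-in for one data block

# Buggy Python version: accumulator elements start at '0', i.e. 48.
buggy = ['0'] * blocksize
for j in range(blocksize):
    buggy[j] = chr(ord(buggy[j]) ^ block[j])

# What the Java code computes: accumulator elements start at 0.
fixed = [chr(0)] * blocksize
for j in range(blocksize):
    fixed[j] = chr(ord(fixed[j]) ^ block[j])

print([ord(c) for c in buggy])  # [34, 4, 102, 72]   -- everything ^ 48
print([ord(c) for c in fixed])  # [18, 52, 86, 120]  -- matches the block
```

Initialising with `[chr(0)] * blocksize` (or, more idiomatically, a `bytearray(blocksize)`) makes the Python accumulator match Java's zeroed array, and the hashes will then agree.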