Is there a good way to emulate or determine the blocks that would exist on
HDFS for a given file? If I have a 135 MB file and my block size is 128 MB,
is it deterministic that I would have 2 blocks: block A covering bytes
0-134217727 and block B covering bytes 134217728-141557759? I am attempting
to calculate a checksum in a manner that matches the one produced by
DFSClient.getFileChecksum().
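
Here is roughly the block-layout calculation I have in mind (a minimal
sketch; the class and method names are just for illustration). The assumption
is that every block except possibly the last is exactly blockSize bytes, so
the boundaries follow directly from the file length:

    public class BlockLayout {

        /** Prints the inclusive byte range of each block for a given file length. */
        public static void printBlocks(long fileLength, long blockSize) {
            long numBlocks = (fileLength + blockSize - 1) / blockSize; // ceiling division
            for (long i = 0; i < numBlocks; i++) {
                long start = i * blockSize;
                long end = Math.min(start + blockSize, fileLength); // exclusive upper bound
                System.out.printf("block %d: bytes %d-%d (%d bytes)%n",
                        i, start, end - 1, end - start);
            }
        }

        public static void main(String[] args) {
            // 135 MB file with a 128 MB block size, as in the question above.
            printBlocks(135L * 1024 * 1024, 128L * 1024 * 1024);
            // block 0: bytes 0-134217727 (134217728 bytes)
            // block 1: bytes 134217728-141557759 (7340032 bytes)
        }
    }

For the checksum itself, this is how I am fetching the reference value I am
trying to reproduce, via the public FileSystem API (which, as I understand
it, delegates to DFSClient and on HDFS returns an MD5MD5CRC32FileChecksum,
i.e. an MD5 over the per-block MD5s of the per-chunk CRC-32 checksums, with
the chunk size set by io.bytes.per.checksum / dfs.bytes-per-checksum,
typically 512 bytes). The class name below is again just for illustration:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileChecksum;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.util.StringUtils;

    public class HdfsChecksumLookup {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            // May return null on filesystems that do not support checksums.
            FileChecksum checksum = fs.getFileChecksum(new Path(args[0]));
            if (checksum != null) {
                System.out.println(checksum.getAlgorithmName() + " "
                        + StringUtils.byteToHexString(checksum.getBytes()));
            }
        }
    }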
