Why not just binary: uint32 width (big-endian), uint32 height (big-endian), then the pixels as uint32 values (big-endian), where the number of pixels = width * height (the product may need a uint64). If you need ASCII, just pipe it through a hex or base64 encoder and you're done. -Anselm
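A minimal sketch of that layout in Python, assuming each pixel is one packed 32-bit value (e.g. RGBA8888; the message doesn't pin down the channel order, so that part is an assumption):

```python
import struct

def write_image(f, width, height, pixels):
    """Write the proposed format: uint32 width, uint32 height,
    then width*height uint32 pixel values, all big-endian."""
    assert len(pixels) == width * height
    f.write(struct.pack(">II", width, height))
    # one packed 32-bit value per pixel (channel order assumed, e.g. RGBA8888)
    f.write(struct.pack(f">{len(pixels)}I", *pixels))

def read_image(f):
    """Read the format back: header first, then width*height pixels."""
    width, height = struct.unpack(">II", f.read(8))
    n = width * height  # pixel count; the product can exceed uint32 range
    pixels = struct.unpack(f">{n}I", f.read(4 * n))
    return width, height, list(pixels)
```

For the ASCII case mentioned above, the resulting bytes can simply be run through `base64.b64encode` or a hex dump, since the format itself stays purely binary.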