On Mon, Apr 27, 2015 at 10:29:25AM +0200, Roberto E. Vargas Caballero wrote:
> > typedef struct {
> > 	uint_least32_t u;
> > 	uint_least32_t mode:12;
> > 	uint_least32_t fg:10;
> > 	uint_least32_t bg:10;
> > } Glyph;
>
> The size of this struct is only one byte less than if the
> same of the struct using shorts. You can test it if you
> want.
I'm attaching a test program I just wrote. It has three different
versions of the Glyph struct, including one that uses ``long'' to
store the UTF-32 character. The output of that program, compiled with
GCC 4.8.4 on my x86_64 machine, is the following:

>$ gcc -std=c99 -o sizeofglyph sizeofglyph.c
>$ ./sizeofglyph
>sizeof(Glyph): 10 bytes
>sizeof(GlyphUtf32): 16 bytes
>sizeof(GlyphUtf32Packed): 8 bytes

As you can see, the bitfield version is actually 2 bytes smaller than
the struct using shorts. Keep in mind that a struct takes on the
maximum alignment of all its fields, and its size is rounded up to a
multiple of that alignment, so that every element of an array of these
structs stays properly aligned. That is why GlyphUtf32 is 16 bytes:
its fields end at offset 14, but the 8-byte ``long'' forces 8-byte
alignment, so two padding bytes are appended at the end.

Now, 2 bytes isn't a big deal in isolation, but do the math: for a
terminal with 240 columns and 120 rows, including the alternate screen
buffer, that's 2 * 240 * 120 * 2 = 115,200 bytes being wasted--over
112 KB--which is significant.
#include <stdint.h>
#include <stdio.h>

typedef struct {
	char c[4];              /* UTF-8 bytes */
	unsigned short mode;
	unsigned short fg;
	unsigned short bg;
} Glyph;

typedef struct {
	long u;                 /* UTF-32 codepoint */
	unsigned short mode;
	unsigned short fg;
	unsigned short bg;
} GlyphUtf32;

typedef struct {
	uint_least32_t u;       /* UTF-32 codepoint */
	uint_least32_t mode:12;
	uint_least32_t fg:10;
	uint_least32_t bg:10;
} GlyphUtf32Packed;

int main(void)
{
	printf("sizeof(Glyph): %zu bytes\n", sizeof(Glyph));
	printf("sizeof(GlyphUtf32): %zu bytes\n", sizeof(GlyphUtf32));
	printf("sizeof(GlyphUtf32Packed): %zu bytes\n",
	       sizeof(GlyphUtf32Packed));
	return 0;
}