On 2011-06-06 21:06, Džen wrote:
> Pretty much answers my question. In my use case it'd be easier to use
> delimiters like \0 or \n, since the data isn't binary. However, now I
> wonder which method needs more CPU time. I suppose that when using
> delimiters there isn't an easier way than using fgetc(), reading through
> the whole data stream. Hard-coded field lengths would be faster if the
> fields contain a lot of characters, I guess.
It would probably be easier (and faster) to read the whole file into a
buffer up front and parse it afterwards. That way you read in large
chunks instead of making one fgetc() call per character. A billion
functions have been written to do this already, so you can probably just
reuse one. Regardless of what you do, the time it takes to parse the
file will probably be insignificant compared to the time it takes to
read it from disk.

If you use \0 as the delimiter for all your cells, the table can simply
hold pointers into the buffer, because every string is already
null-terminated. For example:

    char *table[rows][cols];
    char *data = readfile(stdin);   /* slurp everything first */

    for (int r = 0; r < rows; r++) {
        for (int c = 0; c < cols; c++) {
            table[r][c] = data;
            data += strlen(data) + 1;   /* skip past the '\0' */
        }
    }