On Wed, Mar 24, 2010 at 5:15 AM, Philip Potter
wrote:
> On 24 March 2010 00:56, Eric Veith1 wrote:
>> Sam wrote on 03/23/2010 11:18:11 PM:
>>> Could you use a file of random data? You can create one of those really
>>> easily: dd if=/dev/urandom of=ranfile bs=
>>
>> Theoretically, yes, of course I could just try to create an arbitrary
>> sized file from /dev/urandom via dd.
Hi!
Theoretically, yes, of course I could just try to create an arbitrarily
sized file from /dev/urandom via dd. I had hoped there would be an equally
fast and elegant solution, as with the C approach (malloc without init).
Bob's idea of just reading and piping files from /bin or /usr/
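
The two alternatives discussed above can be sketched roughly like this (a sketch only: the sizes, the `ranfile`/`garbagefile` names, and the `/bin` glob are illustrative assumptions, not values anyone in the thread used):

```shell
# Sam's suggestion: fill a file with random bytes via dd.
# bs * count here gives 1 MiB; pick whatever size the test needs.
dd if=/dev/urandom of=ranfile bs=64k count=16 2>/dev/null

# Bob's idea: reuse existing binaries as "random enough" garbage,
# which avoids paying for the urandom generator entirely.
cat /bin/* 2>/dev/null | head -c 1M > garbagefile
```

The trade-off is that reused binaries are not truly random, so a sufficiently clever cache or compressor might still exploit their structure, whereas /dev/urandom output will not compress or dedupe.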
Eric Veith1 wrote:
Hello list,
this is rather unusual: I want a chunk of random garbage, and I want it
fast. The background is that I have a streaming test, and to run into some
intelligent read-ahead/write-behind/caching algorithm, I need random
stuff. /dev/null is fast, but obviously won't do it.
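
To get a rough feel for the speed gap behind the original question, one could time /dev/urandom against /dev/zero (a sketch; the sizes are arbitrary, and the actual numbers depend entirely on kernel and hardware):

```shell
# /dev/zero is about as fast as a source gets, but its output is
# trivially compressible/cacheable; /dev/urandom costs real CPU time.
time dd if=/dev/zero    of=/dev/null bs=1M count=256 2>/dev/null
time dd if=/dev/urandom of=/dev/null bs=1M count=256 2>/dev/null
```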