Hi Alessandro,

Alessandro Rubini wrote:
>>> + unsigned long *dl = (unsigned long *)dest, *sl = (unsigned long *)src;
>>
>> Nitpick: Are you sure the casts are necessary here?
>
> Without the one on src it complains because of "const". So I write
> both for symmetry.
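For reference, the const issue behind that nitpick can be shown in a minimal sketch (a hypothetical helper, not the actual U-Boot code): a memcpy-style prototype takes `const void *src`, so assigning it to a plain `unsigned long *` without a cast discards the const qualifier and draws a compiler warning; the explicit cast silences it, and the matching cast on `dest` is only there for symmetry.

```c
#include <stddef.h>

/* Hypothetical helper, not the U-Boot code under review: src is
 * "const void *", so "unsigned long *sl = src;" would discard the
 * const qualifier and the compiler complains.  The cast drops const
 * explicitly; the one on dest is not required, just symmetric. */
void *copy_one_word(void *dest, const void *src)
{
	unsigned long *dl = (unsigned long *)dest;
	unsigned long *sl = (unsigned long *)src;	/* cast needed: drops const */

	*dl = *sl;	/* read through sl only; casting away const is safe here */
	return dest;
}
```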
Mike Frysinger wrote:
> Alessandro Rubini wrote:
>> From: Alessandro Rubini
>>
>> If source and destination are aligned, this copies ulong values as
>> long as possible; the trailing part is copied byte by byte. Thanks for
>> the details to Wolfgang Denk, Mike Frysinger, Peter Tyser, Chris Moore.
>>
>> Signed-off-by: Alessandro Rubini
>
> i think you want to drop the count from the list, otherwise we dont consume
> the leading groups of 4 bytes if count isnt a multiple of 4.

Yes, and the same for memset. Wolfgang, didn't you see it was 10% more?
These micro-optimizations are hairy, as you need to measure them to make
sure they actually work.

Ok, V4 tomorrow.
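The "same for memset" remark could be sketched along the same lines (my guess at the shape, not the posted patch): replicate the fill byte across a long, store whole words while the pointer is aligned and enough count remains, then finish byte by byte.

```c
#include <stddef.h>

/* Sketch only, not the actual U-Boot patch: word-at-a-time memset.
 * The alignment test checks just the pointer, so an aligned buffer
 * whose count is not a multiple of sizeof(long) still gets its
 * leading words filled word-wise. */
void *word_memset(void *s, int c, size_t count)
{
	unsigned long *sl = (unsigned long *)s;
	unsigned long cl = 0;
	char *s8;
	size_t i;

	if (((unsigned long)s & (sizeof(*sl) - 1)) == 0) {
		/* replicate the fill byte into every byte of a long */
		for (i = 0; i < sizeof(*sl); i++)
			cl = (cl << 8) | (unsigned char)c;
		while (count >= sizeof(*sl)) {
			*sl++ = cl;
			count -= sizeof(*sl);
		}
	}
	/* trailing (or unaligned) part, byte at a time */
	s8 = (char *)sl;
	while (count--)
		*s8++ = c;

	return s;
}
```

As with the memcpy case, whether this is actually faster on a given board is something to measure, not assume.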
On Friday 09 October 2009 05:12:20 Alessandro Rubini wrote:
> +	/* while all data is aligned (common case), copy a word at a time */
> +	if ( (((ulong)dest | (ulong)src | count) & (sizeof(*dl) - 1)) == 0) {

i think you want to drop the count from the list, otherwise we dont consume
the leading groups of 4 bytes if count isnt a multiple of 4.
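Mike's suggestion, as I read it, would come out something like the sketch below (my reconstruction, not the posted V4): only the two pointers appear in the alignment test, so an aligned copy whose count is not a multiple of the word size still moves the leading whole words word-wise and falls back to bytes only for the remainder.

```c
#include <stddef.h>

/* Reconstruction of the suggested fix, not the actual patch:
 * count is dropped from the alignment test, so aligned buffers
 * with an odd count still get their leading words copied as words. */
void *word_memcpy(void *dest, const void *src, size_t count)
{
	unsigned long *dl = (unsigned long *)dest;
	unsigned long *sl = (unsigned long *)src;
	char *d8;
	const char *s8;

	/* while all data is aligned (common case), copy a word at a time */
	if ((((unsigned long)dest | (unsigned long)src)
	     & (sizeof(*dl) - 1)) == 0) {
		while (count >= sizeof(*dl)) {
			*dl++ = *sl++;
			count -= sizeof(*dl);
		}
	}
	/* trailing (or fully unaligned) part, byte at a time */
	d8 = (char *)dl;
	s8 = (const char *)sl;
	while (count--)
		*d8++ = *s8++;

	return dest;
}
```

With count in the test, an aligned 4-word-plus-3-byte copy would degrade entirely to the byte loop; with it dropped, only the 3-byte tail does.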
From: Alessandro Rubini

If source and destination are aligned, this copies ulong values as long
as possible; the trailing part is copied byte by byte. Thanks for the
details to Wolfgang Denk, Mike Frysinger, Peter Tyser, Chris Moore.

Signed-off-by: Alessandro Rubini
Acked-by: Andrea Gallo
---
lib_generic/s