On Nov 30, 2008, at 6:26 PM, Duane Ellis wrote:
What is the required work area size?
None. The flash methods attempt to allocate space for the flash algorithm and a data block area. They start with the size needed to hold the whole image and divide that by 2 on each failure. If the size drops below a certain lower threshold, they fall back to a byte-by-byte scheme.
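That halving strategy can be sketched as follows. This is a standalone model of the behavior described above, not OpenOCD's actual code; the function name and the 256-byte threshold are assumptions for illustration.

```c
#include <stdint.h>

/* Assumed minimum block size; OpenOCD's real threshold may differ. */
#define MIN_BLOCK_SIZE 256u

/* Model: start with the full image size and halve on each "allocation
 * failure" (here modeled as not fitting in the work area).
 * Returns the buffer size to use, or 0 for byte-by-byte fallback. */
static uint32_t pick_buffer_size(uint32_t image_size, uint32_t work_area_size)
{
    uint32_t size = image_size;
    while (size > work_area_size) {
        size /= 2;
        if (size < MIN_BLOCK_SIZE)
            return 0; /* too small: fall back to byte-by-byte copying */
    }
    return size;
}
```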
I know that having a work area and DCC can give a 5x to 10x download speed improvement. -Duane. The reason for my question is the following: ===== For example, some chips have lots of RAM and some do not have much. What happens when the download goes over the top of the helper?
The algorithm takes the biggest block size available in the working area and divides the image into blocks. The on-chip algorithm is then run once per block.
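Sketched as a loop, the block-wise download looks roughly like this. It is a simplified model of the description above; the per-block target interaction is reduced to a comment, and the function name is invented.

```c
#include <stdint.h>

/* Model: split the image into work-area-sized blocks and invoke the
 * on-chip algorithm once per block. Returns the number of invocations. */
static unsigned program_in_blocks(uint32_t image_size, uint32_t buffer_size)
{
    unsigned runs = 0;
    for (uint32_t offset = 0; offset < image_size; offset += buffer_size) {
        uint32_t count = image_size - offset;
        if (count > buffer_size)
            count = buffer_size;
        /* here OpenOCD would: write 'count' bytes at 'offset' into the
         * target-RAM buffer, then start the flash algorithm on the target */
        (void)count;
        runs++;
    }
    return runs;
}
```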
If I give a 16K work area... is it all backed up?
If you marked a work area as needing to be backed up, it will be backed up when the working area buffer is allocated. It should only back up the areas that are actively in use.
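A toy model of backup-on-allocate and restore-on-free, using a simulated target RAM. OpenOCD's actual working-area bookkeeping differs in detail; the point is only that just the bytes an allocation occupies get saved and restored.

```c
#include <stdint.h>
#include <string.h>

#define WORK_AREA_SIZE 64u

static uint8_t target_ram[WORK_AREA_SIZE]; /* simulated target memory */
static uint8_t backup[WORK_AREA_SIZE];     /* host-side backup copy */

/* Back up only the bytes the allocated buffer will actually occupy. */
static void work_area_alloc(uint32_t offset, uint32_t size)
{
    memcpy(backup + offset, target_ram + offset, size);
}

/* Restore the saved bytes when the buffer is freed. */
static void work_area_free(uint32_t offset, uint32_t size)
{
    memcpy(target_ram + offset, backup + offset, size);
}
```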
Or only the little bit that is needed? At some point, little downloads take longer to set up the helper than to just do directly. Is OpenOCD smart about it?
There is a minimum size threshold for the block algorithms. I don't remember offhand what it is, but below that, it does byte-by-byte copying. Above that, it attempts to use blocks up to the size of the image.
===== It would be nice to document the above in the user manual. Duh!
_______________________________________________
Openocd-development mailing list
Openocd-development@lists.berlios.de
https://lists.berlios.de/mailman/listinfo/openocd-development
--
Rick Altherr [EMAIL PROTECTED]
"He said he hadn't had a byte in three days. I had a short, so I split it with him."
-- Unsigned