Hi,

I'm using GPGME for encrypting and signing updates for an embedded Linux system.

The old version worked fine; now I have to port everything to a new system with less RAM and bigger update files. This causes problems when all of the RAM is used up by update files that are too big. In my workflow I compare the signer keys of the update files against the expected keys, but the gpgme_get_key function fails with an "Invalid crypto engine" error. Is that error message correct at this point? The gpgme_op_decrypt_verify, gpgme_op_decrypt_result and gpgme_op_verify_result calls before it work properly.
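
For reference, here is a simplified sketch of the verification step (context setup, data objects and error handling are omitted; the expected-fingerprint comparison is reduced to the essentials, so the names are only illustrative):

#include <string.h>
#include <gpgme.h>

static gpgme_error_t
check_signer (gpgme_ctx_t ctx, gpgme_data_t cipher, gpgme_data_t plain,
              const char *expected_fpr)
{
  gpgme_error_t err;
  gpgme_verify_result_t vres;
  gpgme_signature_t sig;
  gpgme_key_t key;

  /* Decrypt and verify in one pass.  */
  err = gpgme_op_decrypt_verify (ctx, cipher, plain);
  if (err)
    return err;

  vres = gpgme_op_verify_result (ctx);
  for (sig = vres ? vres->signatures : NULL; sig; sig = sig->next)
    {
      /* This is the call that fails with "Invalid crypto engine".  */
      err = gpgme_get_key (ctx, sig->fpr, &key, 0);
      if (err)
        return err;

      /* Compare against the expected signer key (simplified).  */
      if (key->subkeys && !strcmp (key->subkeys->fpr, expected_fpr))
        {
          gpgme_key_unref (key);
          return 0;   /* expected signer found */
        }
      gpgme_key_unref (key);
    }
  return gpgme_err_make (GPG_ERR_SOURCE_USER_1, GPG_ERR_BAD_SIGNATURE);
}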

The size of the update files will be reduced before the release, so I'm hoping the system will finally work. But it remains a critical point if, someday, the applications running on the system use too much RAM. So I'm now trying to reduce the memory usage of the complete update process.

My GPGME-encrypted update file contains a single tar file, which contains all the files necessary for the update. I'm using libtar to extract the files to the installation path; libtar works directly on the GPGME memory buffer, so the complete update is held in memory during the process. This is what leads to the problems when there is too little RAM. Is there a way to reduce the size of the GPGME memory buffer, maybe by reloading (and re-decrypting) data chunks from the original encrypted update file? The original update file is loaded via a stream. I found the "Callback Based Data Buffers" in the documentation. Is it possible to use these buffers to realize such reloading behavior?
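
To make the question more concrete, this is the kind of thing I have in mind for the plaintext side, using gpgme_data_new_from_cbs from the manual. The FILE* sink here is just a stand-in for feeding the tar extraction chunk by chunk; names are illustrative only:

#include <stdio.h>
#include <sys/types.h>
#include <gpgme.h>

/* Write callback: gpgme hands over each chunk of decrypted data as it
   is produced.  In my case this would feed the tar extractor; here a
   plain FILE* stands in for that.  */
static ssize_t
plain_write_cb (void *handle, const void *buffer, size_t size)
{
  size_t n = fwrite (buffer, 1, size, (FILE *) handle);
  return (n == size) ? (ssize_t) n : -1;
}

static struct gpgme_data_cbs plain_cbs = {
  NULL,            /* read: not needed for an output-only object */
  plain_write_cb,  /* write: receives the decrypted chunks       */
  NULL,            /* seek: hopefully not required for a sink    */
  NULL             /* release */
};

/* Create the plaintext data object from callbacks instead of a
   growing memory buffer.  */
static gpgme_error_t
make_plain_sink (gpgme_data_t *plain, FILE *sink)
{
  return gpgme_data_new_from_cbs (plain, &plain_cbs, sink);
}

I would then pass this data object as the plain argument to gpgme_op_decrypt_verify. What I don't know is whether the engine still buffers the whole plaintext internally, or whether this really keeps the peak memory usage down.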

Thanks in advance,

Marcel

--
Dipl. Ing (FH) Marcel Behlau
(Software Developer)

ELFIN GmbH
Siegburger Straße 215
50679 Köln
Germany

Tel: +49 (221) 6778932-0
Fax: +49 (221) 6778932-2
marcel.beh...@elfin.de
www.elfin.de


