Thanks a lot!
My VS7 cheated me :-) It doesn't display i2d_ASN1_INTEGER() in its "Code
Insight", but compilation is OK.
Now I'm able to use CryptDecodeObject to convert the DER-encoded integer 
into the CRYPT_INTEGER_BLOB that the Win Crypto API uses internally. Crypto 
API keeps multi-byte integers internally in little-endian order and OpenSSL 
in big-endian, am I right?
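
Roughly like this (an untested sketch; der and derLen stand for the buffer 
and length that i2d_ASN1_INTEGER produced):

    #include <windows.h>
    #include <wincrypt.h>
    #include <stdlib.h>

    unsigned char *      der;     /* DER-encoded INTEGER from i2d_ASN1_INTEGER */
    int                  derLen;  /* its length */
    DWORD                blobLen = 0;
    CRYPT_INTEGER_BLOB * blob    = NULL;

    /* First call: ask how large the decoded structure will be. */
    if (!CryptDecodeObject(X509_ASN_ENCODING, X509_MULTI_BYTE_INTEGER,
                           der, derLen, 0, NULL, &blobLen))
        goto error;

    blob = (CRYPT_INTEGER_BLOB *) malloc(blobLen);

    if (!blob)
        goto error;

    /* Second call: decode for real. blob->pbData then holds the magnitude
       in little-endian order, as Crypto API expects. */
    if (!CryptDecodeObject(X509_ASN_ENCODING, X509_MULTI_BYTE_INTEGER,
                           der, derLen, 0, blob, &blobLen))
        goto error;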

However, I still don't see any reason why this function increments its 
second argument, and why it moves it to the first byte after the DER-encoded 
INTEGER (i.e., beyond the preallocated memory). If you could enlighten me, 
I would be grateful.

Best regards
Andrzej





"Frank Balluffi" <[EMAIL PROTECTED]>
2004-02-11 14:41

 
        To:     [EMAIL PROTECTED]
        cc:     [EMAIL PROTECTED]
        Subject:        Re: How to convert internal ASN1_INTEGER into little endian 
content octets



Andrzej, 

Call i2d_ASN1_INTEGER to DER-encode an ASN.1 INTEGER. Pass 0 or NULL as 
the second argument to i2d_ASN1_INTEGER to determine the length of the 
DER-encoded INTEGER. If you pass a non-NULL pointer as the second argument 
to i2d_ASN1_INTEGER, the function will DER-encode the INTEGER and advance 
the second argument to point to the first byte after the DER-encoded 
INTEGER. It looks like this (I did not compile this code): 

    ASN1_INTEGER *  integer; /* points to an ASN1_INTEGER */
    unsigned char * der     = NULL;
    unsigned char * derNext = NULL;
    int             length  = 0;

    /* First call with NULL: only compute the length of the encoding. */
    length = i2d_ASN1_INTEGER(integer, NULL);

    if (length <= 0)
        goto error;

    der = OPENSSL_malloc(length);

    if (!der)
        goto error;

    /*
    Because i2d functions advance their second argument, use the variable
    derNext; der keeps pointing at the start of the buffer.
    */

    derNext = der;

    /* Second call: encode; derNext ends up pointing just past the end. */
    length = i2d_ASN1_INTEGER(integer, &derNext);

    if (length <= 0)
        goto error;
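
The advancing pointer is what lets consecutive i2d calls append their 
encodings into a single buffer. For example (again, I did not compile 
this; a and b are hypothetical ASN1_INTEGERs):

    ASN1_INTEGER *  a;       /* hypothetical INTEGERs to encode */
    ASN1_INTEGER *  b;       /* back-to-back into one buffer    */
    unsigned char * buf = NULL;
    unsigned char * p   = NULL;
    int             lenA;
    int             lenB;

    lenA = i2d_ASN1_INTEGER(a, NULL);
    lenB = i2d_ASN1_INTEGER(b, NULL);

    if (lenA <= 0 || lenB <= 0)
        goto error;

    buf = OPENSSL_malloc(lenA + lenB);

    if (!buf)
        goto error;

    p = buf;
    i2d_ASN1_INTEGER(a, &p); /* p now points just past a's encoding    */
    i2d_ASN1_INTEGER(b, &p); /* b's encoding lands immediately after a */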

Frank 




"Andrzej Posiadala" <[EMAIL PROTECTED]> 
Sent by: [EMAIL PROTECTED] 
02/11/2004 07:57 AM 
Please respond to openssl-users 
        
        To:        [EMAIL PROTECTED] 
        cc:         
        Subject:        How to convert internal ASN1_INTEGER into little 
endian content octets



Hi,

I'm trying to convert an ASN1_INTEGER (specifically a certificate serial 
number) into its DER representation.
I'm using i2c_ASN1_INTEGER, and if that is the right function, I don't 
understand why it moves the pointer passed to it as the second parameter 
past the reserved memory.
Here is what I'm doing:

int size;
ASN1_INTEGER * serial;
unsigned char * serialNumberDER;

size = i2c_ASN1_INTEGER(serial, NULL);
serialNumberDER = new unsigned char[size];
size = i2c_ASN1_INTEGER(serial, & serialNumberDER);

The function has this code at the end:

*pp+=ret;

where pp is a pointer to serialNumberDER, so in effect it moves 
serialNumberDER past the end of the allocated array of unsigned char.
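
To illustrate the effect (untested), if I save the original pointer first:

    unsigned char * start = serialNumberDER; /* remember the start */

    size = i2c_ASN1_INTEGER(serial, &serialNumberDER);

    /* now serialNumberDER == start + size, i.e. one past the last
       content octet written */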

Please explain this to me.
Thanks in advance.

______________________________________________________________________
OpenSSL Project                                 http://www.openssl.org
User Support Mailing List                    [EMAIL PROTECTED]
Automated List Manager                           [EMAIL PROTECTED]
