Hello
 
I'm having a basic issue encoding a custom object. 
 
I would like a new object I've created to appear in the DER-encoded 
output as the following TLV

0x5F20 0x08 [8 bytes of value]
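
As I read the identifier octets, that is an application-class, primitive tag 
with tag number 32, which is what I'm aiming for:

    /* 0x5F = 0101 1111: class bits 01 = application, P/C bit 0 = primitive,
     *                    tag bits 11111 = high-tag-number form follows
     * 0x20 = 32:         the tag number itself (one subsequent octet)
     */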
 
I think this is implicit, application-specific and primitive, so I have the 
following new definitions

    #define ASN1_IMPLICIT_APPLICATION(stname, field, type, tag) \
            ASN1_EX_TYPE(ASN1_TFLG_APPLICATION_IMPLICIT, tag, stname, field, type)
    #define ASN1_TFLG_APPLICATION_IMPLICIT (ASN1_TFLG_IMPTAG|ASN1_TFLG_APPLICATION)
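
For reference, I modelled these on the stock context-specific pair in 
asn1t.h, which (quoting roughly from memory) reads:

    #define ASN1_TFLG_IMPLICIT (ASN1_TFLG_IMPTAG|ASN1_TFLG_CONTEXT)
    #define ASN1_IMP_EX(stname, field, type, tag, ex) \
            ASN1_EX_TYPE(ASN1_TFLG_IMPLICIT|(ex), tag, stname, field, type)

so my versions just swap ASN1_TFLG_CONTEXT for ASN1_TFLG_APPLICATION.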
 
and I choose to declare my object as an ASN1_BIT_STRING like this:

    ASN1_IMPLICIT_APPLICATION(CUSTOMCERT_CINF, subject_customcert_id, ASN1_BIT_STRING, 0x20),
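
For completeness, I populate the field along these lines (trimmed down; 
cinf and val stand in for my real names):

    unsigned char val[8] = { 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08 };

    /* subject_customcert_id is the ASN1_BIT_STRING * field declared above;
     * ASN1_BIT_STRING_set() copies the 8 value bytes into it */
    ASN1_BIT_STRING_set(cinf->subject_customcert_id, val, sizeof(val));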
 
I have checked that the length of the populated bit string is 8 internally. 
The encoder/decoder and asn1parse all run through OK. However, the binary 
result I actually get in DER format is this:

0x5F20 0x09 0x03 [8 bytes of value] 
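
Annotated, as far as I can tell:

    /* 0x5F 0x20  identifier octets, as requested
     * 0x09       length 9, one more than expected
     * 0x03       the unexpected leading octet
     * ...        then my 8 value bytes */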
 
So I have an extra 0x03 byte, which is presumably the V_ASN1_BIT_STRING tag 
(or perhaps the leading "unused bits" octet that DER prepends to a BIT 
STRING's contents) and which I don't want; it pushes the length from 8 to 9.
 
So how can I lose this extra encoded byte without confusing my encoder or 
decoder? I don't need to use an ASN1_BIT_STRING - anything that lets me 
populate the field with 8 arbitrary bytes is fine. There may also be a better 
way to achieve the 0x5F20 tag than my guesses above.
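
For instance, if the extra octet is specific to BIT STRING handling, would 
simply switching the field type to ASN1_OCTET_STRING (whose DER contents are 
just the raw bytes) drop it? A sketch of what I mean, untested:

    /* same tag as before, but carried as an OCTET STRING; the struct field
     * would become an ASN1_OCTET_STRING * and be populated with
     * ASN1_OCTET_STRING_set(), and the content octets should then be
     * exactly the 8 value bytes */
    ASN1_IMPLICIT_APPLICATION(CUSTOMCERT_CINF, subject_customcert_id, ASN1_OCTET_STRING, 0x20),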
 
thanks in advance 

John



______________________________________________________________________
OpenSSL Project                                 http://www.openssl.org
User Support Mailing List                    openssl-users@openssl.org
Automated List Manager                           majord...@openssl.org

Reply via email to