Hello @masahi 

I asked my question earlier and found the solution myself in

[quote="masahi, post:2, topic:11978"]
Issue: Converting model from pytorch to relay model - #5 by popojames
[/quote]


I am facing the prim::DictConstruct issue again, because I am customizing the BERT 
model through a BERT config, and there is no `return_dict` option I can use in this 
case.
Can you explain what exactly you mean by "you can manually turn it into a tuple"?

Here is my setting:
 
    import numpy as np
    import torch
    from tvm import relay
    from transformers import BertTokenizer, BertConfig, BertModel

    depth_multipliers = 1.0  # example value; scales the number of encoder layers

    np_input = torch.tensor(
        np.random.uniform(size=[1, 128], low=0, high=128).astype("int32"))

    BERTconfig = BertConfig(
        hidden_size=768,
        num_hidden_layers=round(12 * depth_multipliers),
        num_attention_heads=12,
        intermediate_size=3072)

    model = BertModel(BERTconfig)
    model = model.eval()
    traced_script_module = torch.jit.trace(model, np_input, strict=False).eval()

    shape_dict = [("input_ids", (1, 128))]  # input name/shape for from_pytorch
    mod, params = relay.frontend.from_pytorch(
        traced_script_module, input_infos=shape_dict)
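To make the question concrete: is something like the wrapper below what you mean by manually turning the dict output into a tuple? This is just my guess at your suggestion, sketched with a toy module standing in for `BertModel` (the names `DictModel` and `TupleWrapper` are my own):

```python
import torch


class DictModel(torch.nn.Module):
    # Toy stand-in for BertModel: returns a dict, which torch.jit.trace
    # records as prim::DictConstruct in the traced graph.
    def forward(self, x):
        return {"last_hidden_state": x + 1, "pooler_output": x.sum()}


class TupleWrapper(torch.nn.Module):
    # Wraps a dict-returning model so the traced graph ends in a plain
    # tuple construction instead of prim::DictConstruct.
    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, x):
        out = self.model(x)
        return out["last_hidden_state"], out["pooler_output"]


# Tracing the wrapper instead of the raw model yields a tuple output.
traced = torch.jit.trace(TupleWrapper(DictModel()), torch.zeros(2, 3))
out = traced(torch.zeros(2, 3))
```

If that is the idea, I assume I would trace `TupleWrapper(model)` instead of `model` in my snippet above, and index the dict with whichever output names my config produces.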


Thanks!





---
[Visit Topic](https://discuss.tvm.apache.org/t/how-to-deal-with-prim-dictconstruct/11978/4) to respond.
