[ https://issues.apache.org/jira/browse/BEAM-14337?focusedWorklogId=776308&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-776308 ]
ASF GitHub Bot logged work on BEAM-14337:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 31/May/22 13:06
            Start Date: 31/May/22 13:06
    Worklog Time Spent: 10m
      Work Description: yeandy commented on code in PR #17470:
URL: https://github.com/apache/beam/pull/17470#discussion_r885613717


##########
sdks/python/apache_beam/ml/inference/pytorch.py:
##########

@@ -39,25 +41,63 @@ class PytorchInferenceRunner(InferenceRunner):
   def __init__(self, device: torch.device):
     self._device = device
 
-  def run_inference(self, batch: List[torch.Tensor],
-                    model: torch.nn.Module) -> Iterable[PredictionResult]:
+  def _convert_to_device(self, examples: torch.Tensor) -> torch.Tensor:
+    """
+    examples, which may or may not be attached to GPU during creation time, need

Review Comment:
   Thanks for the feedback. This is much cleaner. Fixed.


Issue Time Tracking
-------------------

    Worklog Id:     (was: 776308)
    Time Spent: 7h  (was: 6h 50m)

> Support **kwargs for PyTorch models.
> ------------------------------------
>
>                 Key: BEAM-14337
>                 URL: https://issues.apache.org/jira/browse/BEAM-14337
>             Project: Beam
>          Issue Type: Sub-task
>          Components: sdk-py-core
>            Reporter: Anand Inguva
>            Assignee: Andy Ye
>            Priority: P2
>          Time Spent: 7h
>  Remaining Estimate: 0h
>
> Some PyTorch models that subclass torch.nn.Module take extra parameters in
> their forward function call. These extra parameters can be passed as a dict
> of keyword arguments or as positional arguments.
> Example of PyTorch models supported by Hugging Face ->
> [https://huggingface.co/bert-base-uncased]
> [Some torch models on Hugging Face|https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/modeling_bert.py]
> Eg:
> [https://huggingface.co/docs/transformers/model_doc/bert#transformers.BertModel]
> {code:java}
> inputs = {
>     "input_ids": Tensor1,
>     "attention_mask": Tensor2,
>     "token_type_ids": Tensor3,
> }
> model = BertModel.from_pretrained("bert-base-uncased")  # BertModel is a
> # subclass of torch.nn.Module
> outputs = model(**inputs)  # the model's forward method should accept the
> # keys of inputs as keyword arguments.{code}
>  
> [Transformers|https://pytorch.org/hub/huggingface_pytorch-transformers/],
> which is integrated with PyTorch, is supported by Hugging Face as well.



--
This message was sent by Atlassian Jira
(v8.20.7#820007)
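For illustration only, here is a minimal, hypothetical sketch (not the code merged in PR #17470) of how a _convert_to_device helper like the one in the diff above could be combined with a run_inference method that expands a dict of keyed tensors as keyword arguments to the model's forward method, which is the behavior this issue requests. All class and method names below are assumptions made for the example.

{code:python}
import torch


class SketchPytorchInferenceRunner:
  """Hypothetical sketch; not the actual apache_beam implementation."""

  def __init__(self, device: torch.device):
    self._device = device

  def _convert_to_device(self, examples: torch.Tensor) -> torch.Tensor:
    # Tensors may or may not already live on the target device; .to() returns
    # the tensor unchanged if it is already there, otherwise it copies it over.
    return examples.to(self._device)

  def run_inference(self, batch, model: torch.nn.Module):
    with torch.no_grad():
      for example in batch:
        if isinstance(example, dict):
          # Keyed inputs (e.g. Hugging Face BERT): move each tensor to the
          # device and expand the dict as keyword arguments to forward().
          kwargs = {k: self._convert_to_device(v) for k, v in example.items()}
          yield model(**kwargs)
        else:
          # Plain tensor inputs are passed positionally.
          yield model(self._convert_to_device(example))
{code}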