[ https://issues.apache.org/jira/browse/BEAM-14337?focusedWorklogId=776307&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-776307 ]
ASF GitHub Bot logged work on BEAM-14337:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 31/May/22 13:05
            Start Date: 31/May/22 13:05
    Worklog Time Spent: 10m
      Work Description: yeandy commented on code in PR #17470:
URL: https://github.com/apache/beam/pull/17470#discussion_r885612923


##########
sdks/python/apache_beam/ml/inference/sklearn_inference.py:
##########

@@ -42,8 +42,8 @@ class ModelFileType(enum.Enum):
 class SklearnInferenceRunner(InferenceRunner):
-  def run_inference(self, batch: List[numpy.ndarray],
-                    model: Any) -> Iterable[PredictionResult]:
+  def run_inference(self, batch: List[numpy.ndarray], model: Any,
+                    **kwargs) -> Iterable[PredictionResult]:

Review Comment:
   Not needed. Fixed.



Issue Time Tracking
-------------------

    Worklog Id:     (was: 776307)
    Time Spent: 6h 50m  (was: 6h 40m)

> Support **kwargs for PyTorch models.
> ------------------------------------
>
>                 Key: BEAM-14337
>                 URL: https://issues.apache.org/jira/browse/BEAM-14337
>             Project: Beam
>          Issue Type: Sub-task
>          Components: sdk-py-core
>            Reporter: Anand Inguva
>            Assignee: Andy Ye
>              Priority: P2
>          Time Spent: 6h 50m
>  Remaining Estimate: 0h
>
> Some PyTorch models that subclass torch.nn.Module take extra parameters in
> their forward call. These extra parameters can be passed as a dict of
> keyword arguments or as positional arguments.
> Example of PyTorch models supported by Hugging Face ->
> [https://huggingface.co/bert-base-uncased]
> [Some torch models on Hugging
> Face|https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/modeling_bert.py]
> Eg:
> [https://huggingface.co/docs/transformers/model_doc/bert#transformers.BertModel]
> {code:java}
> inputs = {
>     "input_ids": Tensor1,
>     "attention_mask": Tensor2,
>     "token_type_ids": Tensor3,
> }
> model = BertModel.from_pretrained("bert-base-uncased")  # a subclass of
>                                                         # torch.nn.Module
> outputs = model(**inputs)  # the model's forward method accepts the keys of
>                            # inputs as keyword arguments.{code}
>
> The [Transformers|https://pytorch.org/hub/huggingface_pytorch-transformers/]
> library, which integrates with PyTorch, is supported by Hugging Face as well.


--
This message was sent by Atlassian Jira
(v8.20.7#820007)
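
For reference, here is a minimal, PyTorch-only sketch of the pattern the issue describes: a torch.nn.Module whose forward() takes extra keyword arguments, so a caller has to unpack a dict of tensors with **. The ToyBertLikeModel class, its layer sizes, and the tensor shapes are illustrative assumptions only; they are not part of the Beam SDK or of Transformers.

{code:python}
import torch


class ToyBertLikeModel(torch.nn.Module):
  """Hypothetical stand-in for a BERT-style model whose forward() takes
  extra keyword arguments beyond the input batch itself."""

  def __init__(self, vocab_size=30522, hidden=16):
    super().__init__()
    self.embed = torch.nn.Embedding(vocab_size, hidden)
    self.linear = torch.nn.Linear(hidden, hidden)

  def forward(self, input_ids, attention_mask=None, token_type_ids=None):
    # attention_mask and token_type_ids are the kind of extra forward()
    # parameters the issue is about.
    hidden = self.embed(input_ids)
    if attention_mask is not None:
      hidden = hidden * attention_mask.unsqueeze(-1)
    return self.linear(hidden)


inputs = {
    "input_ids": torch.randint(0, 30522, (1, 8)),
    "attention_mask": torch.ones(1, 8),
    "token_type_ids": torch.zeros(1, 8, dtype=torch.long),
}
model = ToyBertLikeModel()
outputs = model(**inputs)  # dict keys become keyword arguments to forward()
{code}

An inference runner that supports this pattern would have to forward such a dict (or **kwargs) through to the model call, which is what this sub-task tracks for the PyTorch RunInference path.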