ueshin commented on code in PR #50084:
URL: https://github.com/apache/spark/pull/50084#discussion_r1970931466
##########
python/pyspark/worker.py:
##########
@@ -160,6 +160,10 @@ def wrap_arrow_batch_udf(f, args_offsets, kwargs_offsets, return_type, runner_co
     import pandas as pd
     func, args_kwargs_offsets = wrap_kwargs_support(f, args_offsets, kwargs_offsets)
+    zero_arg_exec = False
+    if len(args_kwargs_offsets) == 0:
+        args_kwargs_offsets = (0,)  # Series([pyspark._NoValue, ...]) is used for 0-arg execution.

Review Comment:
   This comment may not always be true, e.g., when 0-arg and 1+arg UDFs are mixed. I guess it still works, though.

##########
python/pyspark/worker.py:
##########
@@ -160,6 +160,10 @@ def wrap_arrow_batch_udf(f, args_offsets, kwargs_offsets, return_type, runner_co
     import pandas as pd
     func, args_kwargs_offsets = wrap_kwargs_support(f, args_offsets, kwargs_offsets)
+    zero_arg_exec = False
+    if len(args_kwargs_offsets) == 0:
+        args_kwargs_offsets = (0,)  # Series([pyspark._NoValue, ...]) is used for 0-arg execution.

Review Comment:
   We need to add such tests if they don't exist:
   - 0-arg only
   - 0-arg, 1+arg
   - 1+arg, 0-arg

##########
python/pyspark/worker.py:
##########
@@ -176,6 +180,16 @@ def wrap_arrow_batch_udf(f, args_offsets, kwargs_offsets, return_type, runner_co
     elif type(return_type) == BinaryType:
         result_func = lambda r: bytes(r) if r is not None else r  # noqa: E731
+    if zero_arg_exec:
+
+        def get_args(*args: pd.Series):
+            return [() for _ in range(len(args[0]))]

Review Comment:
   Can we directly use `args[0]`?
   ```suggestion
               return [() for _ in args[0]]
   ```
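As a quick sanity check on the last suggestion: iterating a pandas Series visits one value per row, so both comprehensions build the same list of empty argument tuples. This is a minimal standalone sketch; `batch` merely stands in for the placeholder Series the worker passes for 0-arg execution, and the actual sentinel value is irrelevant to the length.

```python
import pandas as pd

# Stand-in for the placeholder batch column handed to a 0-arg UDF; the
# sentinel value stored in it does not matter for this check.
batch = pd.Series([None, None, None])

# Iterating a Series yields its values, so both forms produce one empty
# argument tuple per row in the batch.
assert [() for _ in range(len(batch))] == [() for _ in batch]
```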
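For the test matrix requested above, here is a minimal sketch of the three orderings, assuming an active SparkSession bound to `spark` and the Arrow-optimized Python UDF path (`useArrow=True`); the UDF names and the exact placement in the test suite are illustrative, not taken from the PR.

```python
from pyspark.sql.functions import udf

# Arrow-optimized Python UDFs; useArrow=True routes them through
# wrap_arrow_batch_udf in python/pyspark/worker.py.
zero_arg = udf(lambda: 1, "int", useArrow=True)
one_arg = udf(lambda x: x + 1, "long", useArrow=True)

df = spark.range(3)

# 0-arg only
df.select(zero_arg().alias("c0")).show()

# 0-arg followed by 1+arg in the same projection
df.select(zero_arg().alias("c0"), one_arg("id").alias("c1")).show()

# 1+arg followed by 0-arg (reversed order)
df.select(one_arg("id").alias("c1"), zero_arg().alias("c0")).show()
```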