Hi all,

What is the best approach for iterating over all columns in a PySpark DataFrame? I want to apply some conditions to every column in the DataFrame. Currently I am using a for loop for the iteration. Is that a good practice with Spark? I am using Spark 3.0.

Please advise.
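For reference, here is a minimal sketch of the kind of loop I mean. The sample data, column names, and the null-flag condition are made up for illustration; my real DataFrame and conditions are different.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("iterate-columns").getOrCreate()

# Hypothetical sample data just to make the snippet runnable;
# the real DataFrame has different columns and many more rows.
df = spark.createDataFrame(
    [(1, 5.0, None), (2, None, "a")],
    ["id", "score", "label"],
)

# Current approach: iterate over every column with a Python for loop
# and apply a condition to each one (here: flag null values).
for c in df.columns:
    df = df.withColumn(c + "_is_null", F.col(c).isNull())

df.show()
```

Is looping and repeatedly calling withColumn like this acceptable, or is there a better pattern?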
Thanks, Devi