Hello everyone, I'm using SparkSQL and would like to understand how I can determine the right value for the "spark.sql.shuffle.partitions" parameter. For example, if I'm joining two RDDs where the first has 10 partitions and the second has 60, how big should this parameter be?
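To make the scenario concrete, here is a minimal sketch of what I mean (the names and data sizes are made up, and I'm using DataFrames built from ranges to stand in for my two RDDs, since spark.sql.shuffle.partitions controls the shuffle partitions of SQL/DataFrame operations like this join):

```scala
import org.apache.spark.sql.SparkSession

object ShufflePartitionsQuestion {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ShufflePartitionsQuestion")
      .master("local[*]") // local master just so the sketch runs standalone
      .getOrCreate()

    // 200 is the default; this is the value I'm unsure how to choose.
    spark.conf.set("spark.sql.shuffle.partitions", "200")

    // Stand-ins for my two datasets: one with 10 partitions, one with 60.
    val dfA = spark.range(0L, 1000000L).repartition(10).toDF("id")
    val dfB = spark.range(0L, 1000000L).repartition(60).toDF("id")

    // The shuffle for this join uses spark.sql.shuffle.partitions partitions,
    // regardless of the 10/60 partitioning of the inputs.
    val joined = dfA.join(dfB, "id")
    println(s"Join output partitions: ${joined.rdd.getNumPartitions}")

    spark.stop()
  }
}
```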
Thank you, Yuri