Hi all,
Thanks for actively voting. Unfortunately, we found a long-standing bug
(SPARK-35278), and the fix (https://github.com/apache/spark/pull/32404) is
going to be merged soon. We may have to fail this RC3.
I will cut RC4 as soon as the fix is merged.
Thank you!
Thanks for starting this good discussion. You can add multiple columns
in a single select to avoid calling withColumn repeatedly:
import org.apache.spark.sql.functions.{col, lit}
// Keep every existing column and append the two literal columns in one select.
val newCols = Seq(col("*"), lit("val1").as("key1"), lit("val2").as("key2"))
df.select(newCols: _*).show()
withColumns would be a nice interface for less technical Spark users.
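In the meantime, here is a rough sketch of what such a helper could look like on top of select. The names DataFrameExtensions and withColumnsAppended are placeholders I made up for illustration, not an existing Spark API:

import org.apache.spark.sql.{Column, DataFrame}
import org.apache.spark.sql.functions.{col, lit}

object DataFrameExtensions {
  implicit class RichDataFrame(df: DataFrame) {
    // Append several named columns in one pass instead of chaining withColumn calls.
    def withColumnsAppended(newCols: Map[String, Column]): DataFrame = {
      val added = newCols.map { case (name, c) => c.as(name) }.toSeq
      df.select((col("*") +: added): _*)
    }
  }
}

// Usage:
// import DataFrameExtensions._
// df.withColumnsAppended(Map("key1" -> lit("val1"), "key2" -> lit("val2"))).show()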