lC", etc)
> val filteredDF = df.select(df.columns
>   .filter(colName => !colsToRemove.contains(colName))
>   .map(colName => new Column(colName)): _*)
>
> which is, I think, a bit ugly.
>
> Thanks,
> Ben.
>
>
>
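For what it's worth, the select-based version can be written without constructing Column objects by hand, using org.apache.spark.sql.functions.col. A rough, untested sketch (dropByName and the Seq[String] parameter are only illustrative names, not an existing API):

import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.col

// Keep every column whose name is not in colsToRemove,
// building the Column objects with functions.col instead of new Column.
def dropByName(df: DataFrame, colsToRemove: Seq[String]): DataFrame =
  df.select(df.columns.filterNot(colsToRemove.contains).map(col): _*)
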
Thanks,
Ben.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Drop-multiple-columns-in-the-DataFrame-API-tp25438.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
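
As a general note on the question in this thread: the chained .drop("colA").drop("colB")... calls in the original message below can be collapsed into a fold over the existing single-column drop(colName: String). A minimal sketch; dropAll is only an illustrative helper name, not part of the API:

import org.apache.spark.sql.DataFrame

// Apply the single-column drop once per name in colsToRemove.
def dropAll(df: DataFrame, colsToRemove: Seq[String]): DataFrame =
  colsToRemove.foldLeft(df)((acc, name) => acc.drop(name))

// e.g. val filteredDF = dropAll(df, Seq("colA", "colB", "colC"))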
-
Hi everyone,
I was wondering if there is a better way to drop multiple columns from a
DataFrame or why there is no drop(cols: Column*) method in the DataFrame
API.
Indeed, I tend to write code like this:
val filteredDF = df.drop("colA")
  .drop("colB")
  .drop("colC")
//etc
which is a bit