Sorry, by API do you mean the use of 3rd-party libraries, or user code, or
something else?
Thanks
--
Hi, I sometimes write convenience methods for pre-processing DataFrames, and
I wonder if it makes sense to contribute them -- should they be included
in Spark itself or supplied as Spark Packages / 3rd-party libraries?
Example:
Get all fields in a DataFrame schema of a certain type.
I end up writing
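something along the following lines (a rough sketch only; fieldsOfType is
just an illustrative name, and it assumes Spark SQL's StructType/StructField
schema API):

  import org.apache.spark.sql.DataFrame
  import org.apache.spark.sql.types.{DataType, StringType, StructField}

  // Collect the fields in a DataFrame's schema whose data type matches
  // the given type. (Helper name is hypothetical.)
  def fieldsOfType(df: DataFrame, dataType: DataType): Seq[StructField] =
    df.schema.fields.filter(_.dataType == dataType).toSeq

  // For example, the names of all string columns in df:
  // val stringCols = fieldsOfType(df, StringType).map(_.name)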