Hello folks,
I would like to ask the Spark devs if it is possible to explicitly define the
key/value types for a map (Spark 3.3.0) as shown below:
import org.apache.spark.sql.functions.{expr, collect_list}
val df = Seq(
  (1, Map("k1" -> "v1", "k2" -> "v3")),
  (1, Map("k3" -> "v3")),
  (2,
Hi Alex,
You can cast the initial value to the desired type:
val mergeExpr = expr("aggregate(data, cast(map() as map<string, string>), (acc, i) -> map_concat(acc, i))")
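For context, a minimal end-to-end sketch of how the cast fits into the aggregation. The second row for id 2 and the column names "id"/"data" are assumptions for illustration, since the original snippet was truncated:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{collect_list, expr}

val spark = SparkSession.builder().master("local[*]").appName("map-merge").getOrCreate()
import spark.implicits._

// Hypothetical sample data; the (2, Map("k4" -> "v4")) row is invented
// to complete the truncated example.
val df = Seq(
  (1, Map("k1" -> "v1", "k2" -> "v3")),
  (1, Map("k3" -> "v3")),
  (2, Map("k4" -> "v4"))
).toDF("id", "data")

// Casting the empty map() literal to map<string, string> gives the
// aggregate accumulator an explicit type, so map_concat can resolve.
val mergeExpr = expr(
  "aggregate(data, cast(map() as map<string, string>), (acc, i) -> map_concat(acc, i))")

df.groupBy("id")
  .agg(collect_list("data").as("data"))
  .select($"id", mergeExpr.as("merged"))
  .show(false)
```

Without the cast, map() is typed as map<string, string> only by chance of the input; an untyped empty map makes the lambda's accumulator unresolvable.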
On 8/27/22 13:06, Alexandros Biratsis wrote: