Yep... I was thinking about that... but it seems to work w JSON
jg
> On Oct 4, 2016, at 19:17, Peter Figliozzi wrote:
>
> It's pretty clear that df.col(xpath) is looking for a column named xpath in
> your df, not executing an xpath over an XML document as you wish. Try
> constructing a UDF which applies your xpath query, and give that as the
> second argument to withColumn.
It's pretty clear that df.col(xpath) is looking for a column named xpath in
your df, not executing an xpath over an XML document as you wish. Try
constructing a UDF which applies your xpath query, and give that as the
second argument to withColumn.
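A minimal Java sketch of that suggestion (Java being what the original post uses). Everything specific here is assumed for illustration: the rawXml column, the sample document, the XPath expression, and the FulfillmentStatus target column (the real withColumn call is truncated later in the thread). The UDF uses the JDK's javax.xml.xpath, not anything from the spark-xml package:

```java
import java.io.StringReader;
import java.util.Arrays;
import java.util.List;

import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.api.java.UDF2;
import org.apache.spark.sql.types.DataTypes;
import org.xml.sax.InputSource;

import static org.apache.spark.sql.functions.callUDF;
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.lit;

public class XPathColumnSketch {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .appName("xpath-udf").master("local[*]").getOrCreate();

    // Hypothetical data: the whole XML document kept as a plain string column.
    List<String> docs = Arrays.asList(
        "<order><fulfillment><status>SHIPPED</status></fulfillment></order>");
    Dataset<Row> df = spark.createDataset(docs, Encoders.STRING()).toDF("rawXml");

    // UDF that evaluates an XPath expression against the XML string.
    spark.udf().register("xpathString",
        (UDF2<String, String, String>) (xml, expr) -> {
          XPath xpath = XPathFactory.newInstance().newXPath();
          return xpath.evaluate(expr, new InputSource(new StringReader(xml)));
        },
        DataTypes.StringType);

    // The UDF call -- not df.col(<xpath>) -- goes in as the second argument to withColumn.
    Dataset<Row> withStatus = df.withColumn("FulfillmentStatus",
        callUDF("xpathString", col("rawXml"), lit("/order/fulfillment/status/text()")));

    withStatus.show(false);
  }
}
```

This only applies when the XML sits in the DataFrame as a raw string. If spark-xml has already inferred a schema, nested struct fields can be reached directly with col("parent.child"), which is presumably why the same pattern "seems to work" with JSON; array-typed fields are the exception and need explode, as in the other thread below.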
On Tue, Oct 4, 2016 at 4:35 PM, Jean Georges Perrin wrote:
Spark 2.0.0
XML parser 0.4.0
Java
Hi,
I am trying to create a new column in my data frame, based on the value of a
sub-element. I have done that several times with JSON, but have not been very
successful with XML.
(I know a world with fewer formats would be easier :) )
Here is the code:
df.withColumn("Fulfill
> *To:* srikanth.je...@gmail.com
> *Cc:* user@spark.apache.org
> *Subject:* Re: AnalysisException exception while parsing XML
>
>
>
> Once you get to the 'Array' type, you have to use explode; you cannot do
> the same traversal.
>
>
>
> On Wed, Aug 31, 2016 at 2:19 PM, wrote:
How do we explode nested arrays?
Thanks,
Sreekanth Jella
From: Peyman Mohajerian
Once you get to the 'Array' type, you have to use explode; you cannot do the
same traversal.
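A short Java sketch of that two-level explode, tied to the subordinate_clerk.duty.name path from the original question. The schema is assumed (subordinate_clerk as an array of structs whose duty field is itself an array of structs with a name field), as are the rowTag and file path; the actual sample XML is not fully shown in the thread:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.explode;

public class NestedExplodeSketch {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .appName("nested-explode").master("local[*]").getOrCreate();

    // Assumed input: spark-xml, one row per <clerk> element (hypothetical rowTag and path).
    Dataset<Row> df = spark.read()
        .format("com.databricks.spark.xml")
        .option("rowTag", "clerk")
        .load("clerks.xml");

    Dataset<Row> dutyNames = df
        // First explode: one row per subordinate clerk.
        .withColumn("subordinate", explode(col("subordinate_clerk")))
        // Second explode: one row per duty of that subordinate.
        .withColumn("duty", explode(col("subordinate.duty")))
        // After both arrays are flattened, plain struct traversal works again.
        .select(col("duty.name").alias("duty_name"));

    dutyNames.show(false);
  }
}
```

If either level turns out to be a struct rather than an array in the inferred schema, the corresponding explode can simply be dropped and dot notation used for that step.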
On Wed, Aug 31, 2016 at 2:19 PM, wrote:
> Hello Experts,
>
>
>
> I am using Spark XML package to parse the XML. Below exception is being
> thrown when trying to *parse a tag that exists at arrays-of-arrays depth*,
> i.e. in this case subordinate_clerk.duty.name.
Hello Experts,
I am using Spark XML package to parse the XML. Below exception is being thrown
when trying to parse a tag that exists at arrays-of-arrays depth, i.e. in this
case subordinate_clerk.duty.name.
The issue is reproducible with the sample XML below:
1
mgr1
2005-07-31