Hi,

I have a nested case class that I wanted to fill with data read from Parquet,
but I got this error:

cannot resolve '`fix`' given input columns: [sizeinternalname, sap_compo08_type]
import spark.implicits._   // needed for the Dataset[Acc] encoder

case class Fix(a: String, b: String)
case class Acc(fix: Fix, c: String)

spark.read.parquet("path").as[Acc]
user@ is the right place for these types of questions.

As the error says, your case class defines a schema that includes columns
like 'fix', but those don't appear to be in your DataFrame. The DataFrame
schema needs to match the case class.
On Wed, Sep 4, 2019 at 6:44 AM El Houssain ALLAMI wrote:
The ZooKeeper client is/was Netty 3, AFAIK, so if you want to use it for
anything, Netty 3 ends up on the classpath.
On Tue, Sep 3, 2019 at 5:18 PM Shixiong(Ryan) Zhu wrote:
> Yep, historical reasons. And Netty 4 is under another namespace, so we can
> use Netty 3 and Netty 4 in the same JVM.
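As the quoted message notes, Netty 3 lives under org.jboss.netty while Netty 4
lives under io.netty, so the two jars can sit on the same classpath without
clashing. A tiny sketch, assuming both a Netty 3.x and a Netty 4.x jar are on
the classpath:

import org.jboss.netty.buffer.ChannelBuffers   // Netty 3 namespace
import io.netty.buffer.Unpooled                // Netty 4 namespace

object NettyCoexistence extends App {
  // Allocate a buffer with each library to show the classes load side by side
  val netty3Buf = ChannelBuffers.buffer(8)
  val netty4Buf = Unpooled.buffer(8)
  println(s"netty3 capacity=${netty3Buf.capacity()}, netty4 capacity=${netty4Buf.capacity()}")
}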
Yes that's right. I don't think Spark's usage of ZK needs any ZK
server, so it's safe to exclude in Spark (at least, so far so good!)
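For a downstream build, that kind of exclusion looks roughly like the
following in sbt (the version number here is just a placeholder):

// build.sbt sketch: depend on spark-core but drop the transitive ZooKeeper client
libraryDependencies += ("org.apache.spark" %% "spark-core" % "2.4.4")
  .exclude("org.apache.zookeeper", "zookeeper")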
On Wed, Sep 4, 2019 at 8:06 AM Steve Loughran wrote:
>
> Zookeeper client is/was netty 3, AFAIK, so if you want to use it for
> anything, it ends up on the CP
Hey everyone,
I'd like to call for a vote on SPARK-27495 SPIP: Support Stage level
resource configuration and scheduling
This is for supporting stage level resource configuration and
scheduling. The basic idea is to allow the user to specify executor
and task resource requirements for each stage
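For context, a rough sketch of the kind of per-stage API this would enable.
It mirrors the ResourceProfile API that this work eventually produced in
Spark 3.1+; the job itself is made up, and stage-level scheduling also needs
dynamic allocation on a supported cluster manager:

import org.apache.spark.sql.SparkSession
import org.apache.spark.resource.{ExecutorResourceRequests, ResourceProfileBuilder, TaskResourceRequests}

object StageLevelSketch extends App {
  val spark = SparkSession.builder().appName("stage-level-sketch").getOrCreate()
  val sc = spark.sparkContext

  // Executor and task requirements for the one stage that actually needs GPUs
  val execReqs = new ExecutorResourceRequests().cores(4).memory("8g").resource("gpu", 1)
  val taskReqs = new TaskResourceRequests().cpus(1).resource("gpu", 1.0)
  val gpuProfile = new ResourceProfileBuilder().require(execReqs).require(taskReqs).build()

  val prepared = sc.parallelize(1 to 1000).map(_ * 2)   // earlier stages keep the default resources
  val scored = prepared
    .withResources(gpuProfile)                          // stages computed from here use the GPU profile
    .mapPartitions(it => it.map(_ + 1))

  println(scored.count())
  spark.stop()
}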