>
> You could generate your own case classes which have more than the 22
> fields, though.
Actually, that is not possible with case classes in Scala 2.10; you would
have to use a normal class if you have more than 22 fields.
This constraint was removed in Scala 2.11.
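For Scala 2.10, a minimal sketch of that plain-class alternative could look
like the following; the field names are invented, and it is assumed the class
keeps a public no-argument constructor and var fields so Flink can analyze it
(falling back to its generic serializer otherwise):

    // Plain (non-case) Scala class with 24 fields; compiles on Scala 2.10,
    // where a case class of this size would not. With a public no-argument
    // constructor and var fields, Flink can typically treat it as a POJO,
    // and otherwise falls back to generic (Kryo) serialization.
    class WideRecord {
      var f01, f02, f03, f04, f05, f06, f07, f08: Int = 0
      var f09, f10, f11, f12, f13, f14, f15, f16: Int = 0
      var f17, f18, f19, f20, f21, f22, f23, f24: String = ""
    }

Such a class can then be used as the element type of a DataSet in place of a
tuple.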
On Wed, Oct 14, 2015 at 11:42 AM, T
I was able to reproduce the error with some more queries by now. However,
it seems to be a problem only in Flink's local mode; during cluster
execution everything works just fine.
Regards, Max
Thanks a lot for the help.
I was able to apply the Tuple1 functionality to fix my problem. I also
moved up to Flink 0.9.
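For reference, a minimal sketch of what that Tuple1 wrapping can look like
with the Flink 0.9 Scala API; the data set and values are made up for
illustration:

    import org.apache.flink.api.scala._

    object Tuple1Example {
      def main(args: Array[String]): Unit = {
        val env = ExecutionEnvironment.getExecutionEnvironment
        // Wrap single values in Tuple1 so that position-based operations
        // such as groupBy(0) and sum(0) still have a field to refer to.
        val singles: DataSet[Tuple1[Int]] =
          env.fromElements(Tuple1(1), Tuple1(2), Tuple1(2))
        val summed = singles.groupBy(0).sum(0)
        summed.print()
      }
    }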
However, I have another problem executing generated Scala programs. It
seems like a Scala program executed with a Flink 0.9 Job Manager only has
a limited number of usable operators.
If you're using Scala, then you're bound to a maximum of 22 fields in a
tuple, because the Scala library does not provide larger tuples. You could
generate your own case classes which have more than 22 fields, though.
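As an illustration of that suggestion (with the caveat from the reply above
that such case classes only compile from Scala 2.11 on), a generated wide
case class might look like this; the field names are invented:

    // Case class with 24 fields: rejected by Scala 2.10, accepted by 2.11+.
    // Note that case classes this wide do not get the usual tupled/unapply
    // helpers, since there is no corresponding TupleN/FunctionN type.
    case class WideRow(
      c01: Int, c02: Int, c03: Int, c04: Int, c05: Int, c06: Int,
      c07: Int, c08: Int, c09: Int, c10: Int, c11: Int, c12: Int,
      c13: Long, c14: Long, c15: Long, c16: Double, c17: Double, c18: Double,
      c19: String, c20: String, c21: String, c22: String, c23: String, c24: String)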
On Oct 14, 2015 11:30 AM, "Ufuk Celebi" wrote:
> On 13 Oct 2015, at 16:06, schul...@informatik.hu-berlin.de wrote:
>
> Hello,
>
> I am currently working on a compilation unit translating AsterixDB's AQL
> into runnable Scala code for Flink's Scala API. During code generation I
> discovered some things that are quite hard to work around. I am