'bigint' is a long, not a Java BigInteger.
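For instance, a quick spark-shell check (just a sketch; the output shown in the
comments is indicative and may vary slightly by Spark version):

    // spark-shell, where `spark` (a SparkSession) is predefined
    val ds = spark.range(3)        // shell reports: org.apache.spark.sql.Dataset[Long] = [id: bigint]

    ds.printSchema()
    // root
    //  |-- id: long (nullable = false)

    // The elements themselves are boxed JVM longs, not java.math.BigInteger:
    println(ds.head().getClass)    // class java.lang.Long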
On Sun, Jun 28, 2020 at 5:52 AM Anwar AliKhan wrote:
>
>
> I wish to draw your attention to this approach, where the BigInt data
> type maps to Long without raising an error.
>
> https://stackoverflow.com/questions/31011797/bug-in-spring-data-jpa-spring-data-returns-listbiginteger-instead-of-listlon
I wish to draw your attention to this approach, where the BigInt data
type maps to Long without raising an error.
https://stackoverflow.com/questions/31011797/bug-in-spring-data-jpa-spring-data-returns-listbiginteger-instead-of-listlon
"This is a issue with Spring data JPA
OK Thanks
On Sat, 27 Jun 2020, 17:36 Sean Owen wrote:
> It does not return a DataFrame. It returns Dataset[Long].
> You do not need to collect(). See my email.
>
> On Sat, Jun 27, 2020, 11:33 AM Anwar AliKhan wrote:
>
>> So the range function actually returns BigInt (the Spark SQL type),
>> and the fact that Dataset[Long] and printSchema (the toString() output)
>> display Long instead of BigInt needs looking into.
It does not return a DataFrame. It returns Dataset[Long].
You do not need to collect(). See my email.
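For example, something along these lines (a sketch in spark-shell; the filter
and the aggregate run on the executors, and only small results come back to
the driver):

    import org.apache.spark.sql.functions.sum

    // spark.range gives a Dataset[java.lang.Long], not a DataFrame, and
    // ordinary transformations and aggregations work on it without collect():
    val ds = spark.range(1, 11)

    println(ds.filter(_ % 2 == 0).count())         // 5
    println(ds.agg(sum("id")).first().getLong(0))  // 55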
On Sat, Jun 27, 2020, 11:33 AM Anwar AliKhan wrote:
> So the range function actually returns BigInt (the Spark SQL type),
> and the fact that Dataset[Long] and printSchema (the toString() output)
> display Long instead of BigInt needs looking into.
So the range function actually returns BigInt (the Spark SQL type),
and the fact that Dataset[Long] and printSchema (the toString() output)
display Long instead of BigInt needs looking into.
Putting that to one side,
my issue with using collect() to get around the casting of elements returned
by range is, I re…
There are several confusing things going on here. I think this is part
of the explanation, not 100% sure:
'bigint' is the Spark SQL type of an 8-byte long. 'long' is the type
of a JVM primitive. Both are the same, conceptually, but represented
differently internally, as they are logically somewhat different things.
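For what it's worth, the two names can be seen side by side on the type object
itself (a sketch against the org.apache.spark.sql.types API; behaviour should
match recent 2.x/3.x releases):

    import org.apache.spark.sql.types.LongType

    // One catalyst type, two renderings:
    println(LongType.simpleString)   // "bigint" -- the SQL-facing name (Dataset.toString, DESCRIBE)
    println(LongType.typeName)       // "long"   -- the name printSchema uses

    // Either way it is an 8-byte value, i.e. a JVM long / java.lang.Long,
    // not java.math.BigInteger:
    println(LongType.defaultSize)    // 8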