That is feasible; the main point is that negative scales were never really
meant to be there in the first place, so they are something we simply forgot
to forbid, and they are something the DBs we are drawing our inspiration from
for decimals (mainly SQLServer) do not support.
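For context, a negative scale means the unscaled digits are multiplied by a
positive power of ten. A minimal illustration with plain java.math.BigDecimal
(not Spark's internal Decimal type):

    // A scientific-notation value can carry a negative scale:
    val d = new java.math.BigDecimal("1E+3") // the value 1000
    println(d.unscaledValue()) // 1
    println(d.scale())         // -3, i.e. 1 * 10^3
    println(d.precision())     // 1

SQL types like DECIMAL(precision, scale) in SQLServer only allow
0 <= scale <= precision, which is why such values are problematic for the
sources mentioned above.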
Honestly, my opinion
So why can't we just add validation that fails sources which don't support
negative scales? This way, we don't need to break backward compatibility in
any way, and it becomes a strict improvement.
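A minimal sketch of what such a check could look like at analysis time (the
helper and its placement are hypothetical, not an existing rule in Spark):

    import org.apache.spark.sql.types.{DataType, DecimalType, StructType}

    // Hypothetical validation: reject schemas containing negative-scale decimals
    // when writing to a source that cannot represent them.
    def assertNoNegativeScale(schema: StructType): Unit = {
      def visit(dt: DataType): Unit = dt match {
        case d: DecimalType if d.scale < 0 =>
          // a real rule would raise an AnalysisException here
          throw new IllegalArgumentException(s"Negative scale is not supported: $d")
        case s: StructType => s.fields.foreach(f => visit(f.dataType))
        case _ => // arrays and maps omitted for brevity
      }
      visit(schema)
    }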
On Tue, Dec 18, 2018 at 8:43 AM, Marco Gaido < marcogaid...@gmail.com > wrote:
>
This is at analysis time.
On Tue, 18 Dec 2018, 17:32 Reynold Xin wrote:
> Is this an analysis time thing or a runtime thing?
>
> On Tue, Dec 18, 2018 at 7:45 AM Marco Gaido
> wrote:
>
>> Hi all,
>>
>> as you may remember, there was a design doc to support operations
>> involving decimals with negative scales
Is this an analysis time thing or a runtime thing?
On Tue, Dec 18, 2018 at 7:45 AM Marco Gaido wrote:
> Hi all,
>
> as you may remember, there was a design doc to support operations
> involving decimals with negative scales. After the discussion in the design
> doc, now the related PR is blocked
>> SQL compliant and Hive compliant or behaving like
>> now (as Hermann was suggesting in the PR). Do we agree on this way? If so,
>> is there any way to read a configuration property in the catalyst project?
>>
>> Thank you,
>> Marco
>> --
>
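To answer the question about reading a configuration property from catalyst: a
minimal sketch using the internal SQLConf / ConfigBuilder machinery (the flag
name and default below are only illustrative):

    import org.apache.spark.sql.internal.SQLConf

    // Hypothetical entry, typically declared inside the SQLConf object itself:
    val DECIMAL_NEGATIVE_SCALE_ALLOWED =
      SQLConf.buildConf("spark.sql.decimal.negativeScale.allowed")
        .doc("When true, decimals with a negative scale are allowed.")
        .booleanConf
        .createWithDefault(false)

    // Reading it from catalyst code, via the conf of the active session:
    val allowed: Boolean = SQLConf.get.getConf(DECIMAL_NEGATIVE_SCALE_ALLOWED)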
> Sent: 21/12/2017 22:46
> To: Marco Gaido
> Cc: Reynold Xin ; dev@spark.apache.org
> Subject: Re: Decimals
>
Losing precision is not acceptable to financial customers. Thus, instead of
returning NULL, I saw that DB2 issues the following error message:
SQL0802N Arithmetic overflow or other arithmetic exception occurred.
SQLSTATE=22003
DB2 on z/OS is being used by most of the biggest banks and financial institutions
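To make the two behaviors concrete, here is a small self-contained sketch
(plain java.math.BigDecimal, not Spark's or DB2's actual code path) of fitting
a value into DECIMAL(precision, scale) and either returning null on overflow,
as Spark does today, or raising an error, as DB2 does:

    import java.math.{BigDecimal => JBigDecimal, RoundingMode}

    // Hypothetical helper: round to the target scale, then check the precision.
    def toPrecision(v: JBigDecimal, precision: Int, scale: Int,
                    failOnOverflow: Boolean): JBigDecimal = {
      val rounded = v.setScale(scale, RoundingMode.HALF_UP)
      if (rounded.precision() > precision) {
        if (failOnOverflow) throw new ArithmeticException("Arithmetic overflow")
        else null
      } else rounded
    }

    // DECIMAL(5, 2) can hold at most 999.99:
    toPrecision(new JBigDecimal("123.456"), 5, 2, failOnOverflow = false) // 123.46
    toPrecision(new JBigDecimal("12345.6"), 5, 2, failOnOverflow = false) // null
    toPrecision(new JBigDecimal("12345.6"), 5, 2, failOnOverflow = true)  // throws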
Hello everybody,
I did some further research and now I am sharing my findings. I am sorry, it is
going to be quite a long e-mail, but I'd really appreciate some feedback when
you have time to read it.
Spark's current implementation of arithmetic operations on decimals was
"copied" from Hive.
Responses inline
On Tue, Dec 12, 2017 at 2:54 AM, Marco Gaido wrote:
> Hi all,
>
> I have seen in recent weeks that there are a lot of problems related to decimal
> values (SPARK-22036, SPARK-22755, for instance). Some are related to
> historical choices, which I don't know, so please excuse me if I