Re: Decimals with negative scale

2018-12-19 Thread Marco Gaido
That is feasible; the main point is that negative scales were never really meant to be there in the first place, so forbidding them was simply overlooked, and the DBs we are drawing our inspiration from for decimals (mainly SQL Server) do not support them. Honestly, my opinion
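To make the terminology concrete, here is a small illustration of what a negative scale means, using Python's standard decimal module rather than Spark itself (the mapping to SQL's decimal(p, s) is an analogy: Python stores an exponent, and scale is the negated exponent):

```python
from decimal import Decimal

# A SQL decimal(p, s) stores an unscaled integer coefficient and a scale s.
# A *negative* scale means the coefficient is multiplied by a positive power
# of ten, so trailing zeros are not stored: decimal(3, -2) can hold 12300
# as coefficient 123. Python's decimal module exposes the same idea via the
# exponent field of as_tuple() (scale = -exponent).
d = Decimal("1.23E+4")              # the value 12300
sign, digits, exponent = d.as_tuple()
print(digits, exponent)             # (1, 2, 3) 2  -> coefficient 123, scale -2
assert d == 12300                   # still compares equal to the full value
```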

Re: Decimals with negative scale

2018-12-18 Thread Reynold Xin
So why can't we just do validation to fail sources that don't support negative scale, if it is not supported? This way, we don't need to break backward compatibility in any way and it becomes a strict improvement. On Tue, Dec 18, 2018 at 8:43 AM, Marco Gaido <marcogaid...@gmail.com> wrote:

Re: Decimals with negative scale

2018-12-18 Thread Marco Gaido
This is at analysis time. On Tue, 18 Dec 2018, 17:32 Reynold Xin wrote: > Is this an analysis time thing or a runtime thing? > On Tue, Dec 18, 2018 at 7:45 AM Marco Gaido wrote: >> Hi all, >> as you may remember, there was a design doc to support operations >> involving decimals with negative sc

Re: Decimals with negative scale

2018-12-18 Thread Reynold Xin
Is this an analysis time thing or a runtime thing? On Tue, Dec 18, 2018 at 7:45 AM Marco Gaido wrote: > Hi all, > as you may remember, there was a design doc to support operations > involving decimals with negative scales. After the discussion in the design > doc, now the related PR is blocked

Re: Decimals

2017-12-25 Thread Ofir Manor
L compliant and Hive compliant or behaving like >> now (as Hermann was suggesting in the PR). Do we agree on this way? If so, >> is there any way to read a configuration property in the catalyst project? >> Thank you, >> Marco

Re: Decimals

2017-12-22 Thread Marco Gaido
> Sent: 21/12/2017 22:46 > To: Marco Gaido > Cc: Reynold Xin; dev@spark.apache.org > Subject: Re: Decimals > > Losing precision is not acceptable to financial customers. Thus, instead > of returning NULL, I saw DB2 issues the following error message: > > SQL0802N

Re: Decimals

2017-12-21 Thread Xiao Li
Losing precision is not acceptable to financial customers. Thus, instead of returning NULL, I saw DB2 issues the following error message: SQL0802N Arithmetic overflow or other arithmetic exception occurred. SQLSTATE=22003. DB2 on z/OS is used by most of the biggest banks and financial institutions
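The strict DB2-style behaviour described above can be sketched with Python's standard decimal module: with the Inexact trap enabled, an operation whose exact result does not fit the configured precision raises an exception instead of silently rounding (the analogue of Spark returning NULL). This is an illustration of the fail-on-overflow policy, not of DB2's or Spark's actual internals:

```python
from decimal import Decimal, Context, Inexact, localcontext

# Enable the Inexact trap: any result that would need rounding raises.
strict = Context(prec=5, traps=[Inexact])

with localcontext(strict):
    try:
        Decimal("123.45") * Decimal("67.89")   # exact result 8381.0205 needs 8 digits
        raised = False
    except Inexact:
        raised = True

print(raised)  # True: the overflow surfaces as an error, not a silent NULL/rounding
```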

Re: Decimals

2017-12-19 Thread Marco Gaido
Hello everybody, I did some further research and now I am sharing my findings. I am sorry, it is going to be a quite long e-mail, but I'd really appreciate some feedback when you have time to read it. Spark's current implementation of arithmetic operations on decimals was "copied" from Hive. T
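For readers following along, the Hive-derived result-type rules that Spark adopted for decimal arithmetic can be sketched as below. This is a simplified, hedged summary (the names `multiply_result_type` and `add_result_type` are mine, and the capping of precision at 38 glosses over Catalyst's actual scale-adjustment logic in DecimalPrecision):

```python
MAX_PRECISION = 38  # Spark's maximum decimal precision

def multiply_result_type(p1, s1, p2, s2):
    # decimal(p1, s1) * decimal(p2, s2): scales add, precision grows by one digit.
    precision = p1 + p2 + 1
    scale = s1 + s2
    return min(precision, MAX_PRECISION), scale

def add_result_type(p1, s1, p2, s2):
    # decimal(p1, s1) + decimal(p2, s2): keep the larger scale,
    # enough integer digits for either operand, plus one carry digit.
    scale = max(s1, s2)
    precision = max(p1 - s1, p2 - s2) + scale + 1
    return min(precision, MAX_PRECISION), scale

print(multiply_result_type(10, 2, 10, 2))  # (21, 4)
print(add_result_type(10, 2, 10, 2))       # (11, 2)
```

The interesting cases (and much of the thread's debate) arise when the exact result type exceeds MAX_PRECISION, where Hive, SQL Server, and Spark diverge in how they trade off integer digits against scale.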

Re: Decimals

2017-12-13 Thread Reynold Xin
Responses inline. On Tue, Dec 12, 2017 at 2:54 AM, Marco Gaido wrote: > Hi all, > I have seen in recent weeks that there are a lot of problems related to decimal > values (SPARK-22036, SPARK-22755, for instance). Some are related to > historical choices, which I don't know, thus please excuse me if I