This is at analysis time.

On Tue, 18 Dec 2018, 17:32 Reynold Xin <r...@databricks.com> wrote:

> Is this an analysis time thing or a runtime thing?
>
> On Tue, Dec 18, 2018 at 7:45 AM Marco Gaido <marcogaid...@gmail.com>
> wrote:
>
>> Hi all,
>>
>> as you may remember, there was a design doc to support operations
>> involving decimals with negative scales. After the discussion in the design
>> doc, the related PR is now blocked because for 3.0 we have another option
>> we can explore, i.e. forbidding negative scales. This is probably a
>> cleaner solution, as most likely we never wanted negative scales, but it is
>> a breaking change, so we wanted to check the opinion of the community.
>>
>> Getting to the topic, here are the two options:
>> *- Forbidding negative scales*
>>   Pros: many sources do not support negative scales (so they can create
>> issues); they were not considered possible in the initial implementation,
>> so we get to a more stable situation.
>>   Cons: some operations which were supported earlier won't work
>> anymore. E.g. since our max precision is 38, if the scale cannot be
>> negative, 1e36 * 1e36 would cause an overflow, while it now works fine
>> (producing a decimal with negative scale); it is basically impossible to
>> create a config which controls the behavior.
>>
>>  *- Handling negative scales in operations*
>>   Pros: no regressions; we support all the operations we supported in 2.x.
>>   Cons: negative scales can cause issues at other points, e.g. when
>> saving to a data source which doesn't support them.
>>
>> Looking forward to hearing your thoughts,
>> Thanks.
>> Marco
>>
>>
>>
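As a side note, the 1e36 * 1e36 case from the quoted message can be sketched with `java.math.BigDecimal`, which is the representation underlying Spark's `Decimal`. With a negative scale the product needs only 1 digit of precision, while with non-negative scales only, the same value would need 73 digits, beyond the maximum of 38. (A minimal sketch of the arithmetic, not Spark's actual code path.)

```java
import java.math.BigDecimal;
import java.math.BigInteger;

public class NegativeScaleDemo {
    public static void main(String[] args) {
        // 1e36 stored as unscaled value 1 with scale -36: precision is only 1
        BigDecimal a = new BigDecimal(BigInteger.ONE, -36);
        System.out.println(a + " precision=" + a.precision() + " scale=" + a.scale());

        // Multiplication adds the scales: unscaled 1, scale -72, precision still 1
        BigDecimal p = a.multiply(a);
        System.out.println(p + " precision=" + p.precision() + " scale=" + p.scale());

        // With non-negative scales only, 1e72 needs 73 significant digits,
        // which exceeds Spark's maximum decimal precision of 38 -> overflow
        // under the "forbid negative scales" option.
        int digitsIfScaleZero = p.precision() - p.scale(); // 1 - (-72) = 73
        System.out.println("digits needed at scale 0: " + digitsIfScaleZero);
    }
}
```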