Github user sarutak commented on the pull request:

    https://github.com/apache/spark/pull/2816#issuecomment-60035199
  
    @yhuai Thanks for pointing that out. Actually, we still cannot move it out of 
ignore, because 92233720368547758061.2 is a BigDecimal while num_str + 1.2 is a 
Double, so the comparison fails.
    In Spark SQL, a floating-point-number literal is represented as a Double, and 
an arithmetic operation with a String operand also results in a Double.
    I think the type of floating-point-number literals and implicit casts needs 
further discussion.
    I'll open another PR.
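    The mismatch can be reproduced outside Spark with plain Java. This is a 
minimal sketch (the literal is taken from the test case; the class name is just 
for illustration): a BigDecimal keeps every digit of the literal, while the 
nearest double rounds it, so comparing the two representations fails.

    ```java
    import java.math.BigDecimal;

    public class PrecisionDemo {
        public static void main(String[] args) {
            // The literal from the ignored test case, kept exactly as a string
            String numStr = "92233720368547758061.2";

            // Parsed as BigDecimal: every digit is preserved
            BigDecimal exact = new BigDecimal(numStr);

            // Parsed as double (how Spark SQL represents floating-point
            // literals): the value is rounded to the nearest representable double
            double approx = Double.parseDouble(numStr);

            System.out.println(exact);                  // all digits intact
            System.out.println(new BigDecimal(approx)); // a slightly different value

            // The comparison fails because the double lost precision
            System.out.println(exact.compareTo(new BigDecimal(approx)) == 0); // false
        }
    }
    ```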

