Github user JasonWHowell commented on the pull request:

    https://github.com/apache/spark/pull/7847#issuecomment-133617010
  
    Thanks for the hard work here. 
    
    Did this land in documentation yet? I searched for it at 
https://spark.apache.org/docs/latest/sql-programming-guide.html but didn't find 
any reference yet.
    
    Someone is asking whether there is a way to extract different granularities of 
the time unit from Spark SQL DateDiff(), such as milliseconds vs. days vs. 
months. I suppose we could get those with simple division if DateDiff returned 
milliseconds: divide by 1000 to get seconds, by 60 twice to get minutes and 
then hours, and by 24 more to get days.
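    Just to make the division chain concrete, here is a plain-Python sketch (not the Spark API; the two dates are made up for the example):

```python
from datetime import datetime

# Hypothetical example dates, just to illustrate deriving coarser
# units from a millisecond-granularity difference by simple division.
start = datetime(2015, 8, 1, 0, 0, 0)
end = datetime(2015, 8, 3, 6, 30, 0)

diff_ms = int((end - start).total_seconds() * 1000)

diff_seconds = diff_ms // 1000          # / 1000      -> seconds
diff_hours = diff_seconds // (60 * 60)  # / 60 / 60   -> hours
diff_days = diff_hours // 24            # / 24        -> days

print(diff_ms, diff_hours, diff_days)
```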
    
    If the difference only comes back in whole days, the result is rounded 
(losing hours and seconds precision), so multiplying by 24 * 60 * 60 to recover 
seconds would be extremely lossy.
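    To show how much precision a day-rounded difference can lose, here is another plain-Python sketch using the same hypothetical dates as above:

```python
from datetime import datetime

# Hypothetical example dates: 2 days and 6.5 hours apart.
start = datetime(2015, 8, 1, 0, 0, 0)
end = datetime(2015, 8, 3, 6, 30, 0)

actual_seconds = int((end - start).total_seconds())
rounded_days = actual_seconds // (24 * 60 * 60)   # day-granularity result
reconstructed = rounded_days * 24 * 60 * 60       # multiply back up to seconds

lost = actual_seconds - reconstructed  # the hours/seconds precision that vanished
print(actual_seconds, reconstructed, lost)
```

Everything below a whole day (6.5 hours here) simply disappears, which is why multiplying back up cannot recover it.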
    
    I think folks are used to the SQL Server implementation of DateDiff (where 
you provide the DatePart unit), as opposed to the Hive and MySQL 
implementations, which seem to return only rounded day counts. 
https://msdn.microsoft.com/en-us/library/ms189794.aspx
    
    I'd appreciate any advice, or a pointer to the right forum. Sorry to bother ~ 
Jason


